@ryan-williams · created July 26, 2018 04:33
$ ./gradlew :beam-runners-flink_2.11-job-server:runShadow
Parallel execution is an incubating feature.
Parallel execution with configuration on demand is an incubating feature.
> Configure project :beam-model-pipeline
applyPortabilityNature with default configuration for project beam-model-pipeline
> Configure project :beam-model-fn-execution
applyPortabilityNature with default configuration for project beam-model-fn-execution
> Configure project :beam-model-job-management
applyPortabilityNature with default configuration for project beam-model-job-management
> Task :beam-runners-flink_2.11-job-server:runShadow
Listening for transport dt_socket at address: 5005
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - stored metadata: staging_session_token: "{\"sessionId\":\"job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d\",\"basePath\":\"/tmp/flink-artifacts\"}"
metadata {
name: "pickled_main_session"
md5: "UoIk3316DjrZqF3HP8dyBg=="
}
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Going to stage artifact pickled_main_session to /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Staging artifact completed for /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-0] INFO org.apache.beam.runners.flink.FlinkJobInvoker - Invoking job BeamApp-ryan-0726042817-60555634_66b45173-cafa-4e0b-92e7-60e753d9eb00
[grpc-default-executor-0] INFO org.apache.beam.runners.flink.FlinkJobInvocation - Starting job invocation BeamApp-ryan-0726042817-60555634_66b45173-cafa-4e0b-92e7-60e753d9eb00
[flink-runner-job-server] INFO org.apache.beam.runners.flink.FlinkJobInvocation - Translating pipeline to Flink program.
[grpc-default-executor-0] WARN org.apache.beam.runners.flink.FlinkJobInvocation - addMessageObserver() not yet implemented.
[flink-runner-job-server] INFO org.apache.beam.runners.flink.FlinkExecutionEnvironments - Creating a Batch Execution Environment.
[flink-runner-job-server] INFO org.apache.flink.api.java.ExecutionEnvironment - The job has 0 registered types and 0 default Kryo serializers
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting Flink Mini Cluster
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting Metrics Registry
[flink-runner-job-server] INFO org.apache.flink.runtime.metrics.MetricRegistryImpl - No metrics reporter configured, no metrics will be exposed/reported.
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting RPC Service(s)
[flink-akka.actor.default-dispatcher-2] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting high-availability services
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.BlobServer - Created BLOB server storage directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/blobStore-de01ad18-4032-4e62-be87-0a40bbca5a10
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.BlobServer - Started BLOB server at 0.0.0.0:58132 - max concurrent requests: 50 - max backlog: 1000
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting ResourceManger
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.resourcemanager.StandaloneResourceManager at akka://flink/user/resourcemanager_6d9c641b-64ef-48b6-a6db-74253e9be23b .
[flink-runner-job-server] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.resourcemanager.StandaloneResourceManager@3fe64d2a @ akka://flink/user/resourcemanager_6d9c641b-64ef-48b6-a6db-74253e9be23b
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - ResourceManager akka://flink/user/resourcemanager_6d9c641b-64ef-48b6-a6db-74253e9be23b was granted leadership with fencing token 838800a029b1b2c31d4755d247ee42fa
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Starting the SlotManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader akka://flink/user/resourcemanager_6d9c641b-64ef-48b6-a6db-74253e9be23b , session=1d4755d2-47ee-42fa-8388-00a029b1b2c3
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Created BLOB cache storage directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/blobStore-5a5a6458-e277-4d92-b154-d4b4863298ad
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.TransientBlobCache - Created BLOB cache storage directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/blobStore-34608611-ee22-4b52-a556-6119d605dc7a
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting 1 TaskManger(s)
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.TaskManagerServices - Temporary file directory '/var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T': total 931 GB, usable 422 GB (45.33% usable)
[flink-runner-job-server] INFO org.apache.flink.runtime.io.network.buffer.NetworkBufferPool - Allocated 404 MB for network buffer pool (number of memory segments: 12945, bytes per segment: 32768).
[flink-runner-job-server] INFO org.apache.flink.runtime.query.QueryableStateUtils - Could not load Queryable State Client Proxy. Probable reason: flink-queryable-state-runtime is not in the classpath. To enable Queryable State, please move the flink-queryable-state-runtime jar from the opt to the lib folder.
[flink-runner-job-server] INFO org.apache.flink.runtime.query.QueryableStateUtils - Could not load Queryable State Server. Probable reason: flink-queryable-state-runtime is not in the classpath. To enable Queryable State, please move the flink-queryable-state-runtime jar from the opt to the lib folder.
[flink-runner-job-server] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Starting the network environment and its components.
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.TaskManagerServices - Limiting managed memory to 0.7 of the currently free heap space (2537 MB), memory will be allocated lazily.
[flink-runner-job-server] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager uses directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-io-6456b2e2-a532-4faa-89e5-6fe8067f0831 for spill files.
[flink-runner-job-server] INFO org.apache.flink.runtime.filecache.FileCache - User file cache uses directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-dist-cache-a6774438-9827-4bc5-995e-4d862226c52d
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.TaskManagerConfiguration - Messages have a max timeout of 10000 ms
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.taskexecutor.TaskExecutor at akka://flink/user/taskmanager_0 .
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Start job leader service.
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting dispatcher rest endpoint.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Connecting to ResourceManager akka://flink/user/resourcemanager_6d9c641b-64ef-48b6-a6db-74253e9be23b(838800a029b1b2c31d4755d247ee42fa).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Resolved ResourceManager address, beginning registration
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Registration at ResourceManager attempt 1 (timeout=100ms)
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Successful registration at resource manager akka://flink/user/resourcemanager_6d9c641b-64ef-48b6-a6db-74253e9be23b under registration id d9d294ec25da1a7d8553c62eb1c8f864.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Registering TaskManager 9387ecaa-1aa3-4c6a-a5a5-78dcb2b86ea1 under d9d294ec25da1a7d8553c62eb1c8f864 at the SlotManager.
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Starting rest endpoint.
[flink-runner-job-server] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - Log file environment variable 'log.file' is not set.
[flink-runner-job-server] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - JobManager log files are unavailable in the web dashboard. Log file location not found in environment variable 'log.file' or configuration key 'Key: 'web.log.path' , default: null (deprecated keys: [jobmanager.web.log.path])'.
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Failed to load web based job submission extension. Probable reason: flink-runtime-web is not in the classpath.
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Rest endpoint listening at localhost:58133
[flink-runner-job-server] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint@57b923f5 @ http://localhost:58133
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - http://localhost:58133 was granted leadership with leaderSessionID=d3f15d7d-f611-4ae5-a284-0f53a4282451
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting job dispatcher(s) for JobManger
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader http://localhost:58133 , session=d3f15d7d-f611-4ae5-a284-0f53a4282451
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.dispatcher.StandaloneDispatcher at akka://flink/user/dispatcher2d71dc22-1c36-44d2-a660-b2a8548810af .
[flink-runner-job-server] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.dispatcher.StandaloneDispatcher@6e8819d9 @ akka://flink/user/dispatcher2d71dc22-1c36-44d2-a660-b2a8548810af
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher akka://flink/user/dispatcher2d71dc22-1c36-44d2-a660-b2a8548810af was granted leadership with fencing token 9cf71f75331e49eb16823e53fb044d13
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all persisted jobs.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader akka://flink/user/dispatcher2d71dc22-1c36-44d2-a660-b2a8548810af , session=16823e53-fb04-4d13-9cf7-1f75331e49eb
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job 5aaef94e469fb53caa5e501440cffd56 (BeamApp-ryan-0726042817-60555634).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_1 .
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Initializing job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy NoRestartStrategy for BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.jobmaster.slotpool.SlotPool at akka://flink/user/dc5ca5c1-e296-4039-8ab1-b0d68d56a2f0 .
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master for job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization on master in 1 ms.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.jobmaster.JobManagerRunner@6a5bd908 @ akka://flink/user/jobmanager_1
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56) was granted leadership with session id a5f2986a-ea32-4ea8-98f2-7bde8b1b664d at akka://flink/user/jobmanager_1.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56)
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56) switched from state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-1] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader akka://flink/user/jobmanager_1 , session=a5f2986a-ea32-4ea8-98f2-7bde8b1b664d
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Connecting to ResourceManager akka://flink/user/resourcemanager_6d9c641b-64ef-48b6-a6db-74253e9be23b(838800a029b1b2c31d4755d247ee42fa)
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Resolved ResourceManager address, beginning registration
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Registration at ResourceManager attempt 1 (timeout=100ms)
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Registering job manager 98f27bde8b1b664da5f2986aea324ea8@akka://flink/user/jobmanager_1 for job 5aaef94e469fb53caa5e501440cffd56.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Cannot serve slot request, no ResourceManager connected. Adding as pending request 15066cce182a271e1683cfd7c3d9f595
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Registered job manager 98f27bde8b1b664da5f2986aea324ea8@akka://flink/user/jobmanager_1 for job 5aaef94e469fb53caa5e501440cffd56.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - JobManager successfully registered at ResourceManager, leader id: 838800a029b1b2c31d4755d247ee42fa.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Requesting slot with profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} from resource manager (request = 15066cce182a271e1683cfd7c3d9f595).
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Request slot with profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} for job 5aaef94e469fb53caa5e501440cffd56 with allocation id bfe758e14b56d64cc9bc31ddfd574942.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Receive slot request bfe758e14b56d64cc9bc31ddfd574942 for job 5aaef94e469fb53caa5e501440cffd56 from resource manager with leader id 838800a029b1b2c31d4755d247ee42fa.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Allocated slot for bfe758e14b56d64cc9bc31ddfd574942.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Add job 5aaef94e469fb53caa5e501440cffd56 for job leader monitoring.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Try to register at job manager akka://flink/user/jobmanager_1 with leader id a5f2986a-ea32-4ea8-98f2-7bde8b1b664d.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Resolved JobManager address, beginning registration
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Registration at JobManager attempt 1 (timeout=100ms)
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Successful registration at job manager akka://flink/user/jobmanager_1 for job 5aaef94e469fb53caa5e501440cffd56.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Establish JobManager connection for job 5aaef94e469fb53caa5e501440cffd56.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Offer reserved slots to the leader of job 5aaef94e469fb53caa5e501440cffd56.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Activate slot bfe758e14b56d64cc9bc31ddfd574942.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1).
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) switched from CREATED to DEPLOYING.
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) [DEPLOYING]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1).
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) switched from CREATED to DEPLOYING.
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) [DEPLOYING]
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) [DEPLOYING].
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) [DEPLOYING].
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) [DEPLOYING].
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) [DEPLOYING].
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) switched from DEPLOYING to RUNNING.
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) switched from DEPLOYING to RUNNING.
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) exceeded the 80 characters length limit and was truncated.
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) exceeded the 80 characters length limit and was truncated.
[jobmanager-future-thread-1] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) switched from RUNNING to FINISHED.
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) switched from RUNNING to FINISHED.
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92).
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc).
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) [FINISHED]
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) 1b5045002fde9a4886f8a491bb7cbb92.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) 7f737caa370dd3d122ed89a3b46e8dcc.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (1b5045002fde9a4886f8a491bb7cbb92) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7f737caa370dd3d122ed89a3b46e8dcc) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1).
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1).
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) exceeded the 80 characters length limit and was truncated.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image 21c4c2883172
java.io.IOException: Received exit code 1 for command 'docker pull 21c4c2883172'. stderr: Error response from daemon: pull access denied for 21c4c2883172, repository does not exist or may require 'docker login'
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:150)
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:77)
at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:147)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:174)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:170)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4992)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.forStage(DockerJobBundleFactory.java:183)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.getStageBundleFactory(BatchFlinkExecutableStageContext.java:55)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:96)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Loading manifest for retrieval token /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Manifest at /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST has 1 artifact locations
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST -> 1 artifacts
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetArtifact name: "pickled_main_session"
retrieval_token: "/tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST"
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Artifact pickled_main_session located in /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-0] INFO sdk_worker_main.main - Logging handler created.
[grpc-default-executor-0] INFO sdk_worker_main.main - semi_persistent_directory: /tmp
[grpc-default-executor-0] INFO sdk_worker_main.start - Status HTTP server running at localhost:33775
[grpc-default-executor-0] INFO sdk_worker_main.main - Python sdk harness started with pipeline_options: {u'beam:option:dry_run:v1': False, u'beam:option:harness_docker_image:v1': u'21c4c2883172', u'beam:option:pipeline_type_check:v1': True, u'beam:option:job_endpoint:v1': u'localhost:8099', u'beam:option:dataflow_endpoint:v1': u'https://dataflow.googleapis.com', u'beam:option:runner:v1': None, u'beam:option:sdk_location:v1': u'container', u'beam:option:direct_runner_use_stacked_bundle:v1': True, u'beam:option:runtime_type_check:v1': False, u'beam:option:flink_master:v1': u'[auto]', u'beam:option:save_main_session:v1': True, u'beam:option:type_check_strictness:v1': u'DEFAULT_TO_ANY', u'beam:option:region:v1': u'us-central1', u'beam:option:profile_memory:v1': False, u'beam:option:profile_cpu:v1': False, u'beam:option:app_name:v1': None, u'beam:option:options_id:v1': 1, u'beam:option:no_auth:v1': False, u'beam:option:streaming:v1': False, u'beam:option:experiments:v1': [u'beam_fn_api'], u'beam:option:job_name:v1': u'BeamApp-ryan-0726042817-60555634'}
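Note on the pipeline_options dump above: beam:option:harness_docker_image:v1 is set to the bare image ID 21c4c2883172, which explains the earlier 'docker pull 21c4c2883172' warning: the registry reports that no such repository exists (or that it would require 'docker login'), so the pull fails, but the warning is non-fatal and the SDK harness container is started from the locally cached image, as the harness log lines here confirm. The following is a hedged reconstruction of how a Python pipeline with these options might have been submitted to this job server; the flags mirror the options dump, but the script body and file paths are assumptions, not the author's actual code.

# Hedged reconstruction, not the author's actual script: the flags mirror the
# pipeline_options dump above, and the 'read'/'write' labels match the operator
# names in the Flink log, but the input/output paths are assumptions.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',              # the dump shows runner=None, so how the runner was selected is assumed
    '--job_endpoint=localhost:8099',        # beam:option:job_endpoint:v1
    '--harness_docker_image=21c4c2883172',  # bare image ID; source of the failed 'docker pull' above
    '--experiments=beam_fn_api',            # beam:option:experiments:v1
    '--save_main_session',                  # beam:option:save_main_session:v1
])

with beam.Pipeline(options=options) as p:
    (p
     | 'read' >> beam.io.ReadFromText('/tmp/input.txt')   # assumed input path
     | 'write' >> beam.io.WriteToText('/tmp/output'))     # assumed output prefix

The subsequent lines show the harness connecting back over gRPC (logging, control, and data channels) and processing bundles, so in this local MiniCluster run the pull warning can be read as cosmetic.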
[grpc-default-executor-0] INFO sdk_worker.__init__ - Creating insecure control channel.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Control channel established.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Initializing SDKHarness with 12 workers.
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 1
[grpc-default-executor-1] INFO sdk_worker.run - Got work 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 2
[grpc-default-executor-1] INFO sdk_worker.run - Got work 3
[grpc-default-executor-1] INFO sdk_worker.run - Got work 4
[grpc-default-executor-1] INFO sdk_worker.create_state_handler - Creating channel for host.docker.internal:58151
[grpc-default-executor-1] INFO data_plane.create_data_channel - Creating channel for host.docker.internal:58150
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/AddRandomKeys output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Split output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/AddRandomKeys output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Split output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Split output_tags=['out'], receivers=[ConsumerSet[read/Read/Split.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/AddRandomKeys output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Split output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Split.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/AddRandomKeys output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[jobmanager-future-thread-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1).
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) [DEPLOYING]
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) [DEPLOYING].
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc).
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1).
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) exceeded the 80 characters length limit and was truncated.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) [DEPLOYING]
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) [DEPLOYING].
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) [DEPLOYING].
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) [FINISHED]
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) b813274af2a5a4987a3111d048af35cc.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) 99ed9d40481e0ca51c6c0b377743a520.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (b813274af2a5a4987a3111d048af35cc) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/21c4c2883172:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (99ed9d40481e0ca51c6c0b377743a520) switched from RUNNING to FINISHED.
[grpc-default-executor-0] INFO sdk_worker.run - No more requests from control plane
[grpc-default-executor-0] INFO sdk_worker.run - SDK Harness waiting for in-flight requests to complete
[grpc-default-executor-1] INFO data_plane.close - Closing all cached grpc data channels.
[grpc-default-executor-1] INFO sdk_worker.close - Closing all cached gRPC state handlers.
[jobmanager-future-thread-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[grpc-default-executor-1] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[grpc-default-executor-0] INFO sdk_worker.run - Done consuming work.
[grpc-default-executor-0] INFO sdk_worker_main.main - Python sdk harness exiting.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1).
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a).
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) [DEPLOYING]
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) [FINISHED]
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) [DEPLOYING].
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) 1f9f51fffd00dd572ef416b9e1990c4a.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (1f9f51fffd00dd572ef416b9e1990c4a) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) exceeded the 80 characters length limit and was truncated.
[jobmanager-future-thread-1] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (attempt #0) to localhost
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) exceeded the 80 characters length limit and was truncated.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1).
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) exceeded the 80 characters length limit and was truncated.
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a).
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) 72ce29bca0224fe136bbaf6af682e95a.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (72ce29bca0224fe136bbaf6af682e95a) switched from RUNNING to FINISHED.
[Finalizer] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[Finalizer] WARN org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory - Error cleaning up environment url: "21c4c2883172"
java.lang.IllegalStateException: call already closed
at org.apache.beam.vendor.guava.v20.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl.close(ServerCallImpl.java:172)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onCompleted(ServerCalls.java:358)
at org.apache.beam.runners.fnexecution.state.GrpcStateService.close(GrpcStateService.java:54)
at org.apache.beam.runners.fnexecution.GrpcFnServer.close(GrpcFnServer.java:83)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.close(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.lambda$createEnvironmentCache$0(DockerJobBundleFactory.java:163)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1963)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3562)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3538)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3309)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.clear(LocalCache.java:4322)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4937)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.close(DockerJobBundleFactory.java:201)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.finalize(BatchFlinkExecutableStageContext.java:73)
at java.lang.System$2.invokeFinalize(System.java:1270)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:98)
at java.lang.ref.Finalizer.access$100(Finalizer.java:34)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:210)
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image 21c4c2883172
java.io.IOException: Received exit code 1 for command 'docker pull 21c4c2883172'. stderr: Error response from daemon: pull access denied for 21c4c2883172, repository does not exist or may require 'docker login'
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:150)
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:77)
at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:147)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:174)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:170)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4992)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.forStage(DockerJobBundleFactory.java:183)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.getStageBundleFactory(BatchFlinkExecutableStageContext.java:55)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:96)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
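
(Note, not part of the original log output: the pull failure above is consistent with the SDK harness image having been supplied as a bare local image ID, `21c4c2883172`, which `docker pull` cannot resolve against any registry. Below is a minimal, hypothetical sketch of resubmitting the Python pipeline with a repository-style tag instead. The tag `beam-python-sdk:local`, the `PortableRunner` choice, and the toy pipeline body are assumptions; only the `job_endpoint`, `experiments`, `save_main_session`, and `harness_docker_image` option names mirror the options echoed later in this log.)

    # Hypothetical sketch: resubmit with a repository:tag harness image instead of
    # the bare image ID 21c4c2883172, which `docker pull` cannot resolve.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',                       # assumption: portable submission path
        '--job_endpoint=localhost:8099',                 # matches beam:option:job_endpoint in the log
        '--harness_docker_image=beam-python-sdk:local',  # hypothetical tag; a raw image ID is not pullable
        '--experiments=beam_fn_api',                     # matches beam:option:experiments
        '--save_main_session',                           # matches beam:option:save_main_session
    ])

    with beam.Pipeline(options=options) as pipeline:
        # Toy body; the real pipeline behind this log (read/split/pair_with_one/write)
        # is not reproduced here.
        (pipeline
         | 'read' >> beam.Create(['a b', 'b c'])
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda word: (word, 1)))
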
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST -> 1 artifacts
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetArtifact name: "pickled_main_session"
retrieval_token: "/tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST"
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Artifact pickled_main_session located in /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-1] INFO sdk_worker_main.main - Logging handler created.
[grpc-default-executor-1] INFO sdk_worker_main.main - semi_persistent_directory: /tmp
[grpc-default-executor-1] INFO sdk_worker_main.start - Status HTTP server running at localhost:33391
[grpc-default-executor-1] INFO sdk_worker_main.main - Python sdk harness started with pipeline_options: {u'beam:option:dry_run:v1': False, u'beam:option:harness_docker_image:v1': u'21c4c2883172', u'beam:option:pipeline_type_check:v1': True, u'beam:option:job_endpoint:v1': u'localhost:8099', u'beam:option:dataflow_endpoint:v1': u'https://dataflow.googleapis.com', u'beam:option:runner:v1': None, u'beam:option:sdk_location:v1': u'container', u'beam:option:direct_runner_use_stacked_bundle:v1': True, u'beam:option:runtime_type_check:v1': False, u'beam:option:flink_master:v1': u'[auto]', u'beam:option:save_main_session:v1': True, u'beam:option:type_check_strictness:v1': u'DEFAULT_TO_ANY', u'beam:option:region:v1': u'us-central1', u'beam:option:profile_memory:v1': False, u'beam:option:profile_cpu:v1': False, u'beam:option:app_name:v1': None, u'beam:option:options_id:v1': 1, u'beam:option:no_auth:v1': False, u'beam:option:streaming:v1': False, u'beam:option:experiments:v1': [u'beam_fn_api'], u'beam:option:job_name:v1': u'BeamApp-ryan-0726042817-60555634'}
[grpc-default-executor-1] INFO sdk_worker.__init__ - Creating insecure control channel.
[grpc-default-executor-1] INFO sdk_worker.__init__ - Control channel established.
[grpc-default-executor-1] INFO sdk_worker.__init__ - Initializing SDKHarness with 12 workers.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 2
[grpc-default-executor-1] INFO sdk_worker.run - Got work 3
[grpc-default-executor-1] INFO sdk_worker.create_state_handler - Creating channel for host.docker.internal:58172
[grpc-default-executor-1] INFO sdk_worker.run - Got work 4
[grpc-default-executor-1] INFO data_plane.create_data_channel - Creating channel for host.docker.internal:58171
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/ReadSplits output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation pair_with_one output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/RemoveRandomKeys output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation split output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation read/Read/ReadSplits output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/RemoveRandomKeys output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/RemoveRandomKeys output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/RemoveRandomKeys.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/ReadSplits output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/ReadSplits.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33).
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) ba76a328d96545d7abe5a083034bbb33.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ba76a328d96545d7abe5a083034bbb33) switched from RUNNING to FINISHED.
[jobmanager-future-thread-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) switched from SCHEDULED to DEPLOYING.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (attempt #0) to localhost
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/RemoveRandomKeys output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/RemoveRandomKeys.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/ReadSplits output_tags=['out'], receivers=[ConsumerSet[read/Read/ReadSplits.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1).
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) [DEPLOYING]
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation split output_tags=['out'], receivers=[ConsumerSet[split.out0, coder=WindowedValueCoder[StrUtf8Coder], len(consumers)=1]]>
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) [DEPLOYING].
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1).
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) switched from DEPLOYING to RUNNING.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation pair_with_one output_tags=['out'], receivers=[ConsumerSet[pair_with_one.out0, coder=WindowedValueCoder[TupleCoder[StrUtf8Coder, VarIntCoder]], len(consumers)=1]]>
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) exceeded the 80 characters length limit and was truncated.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) [DEPLOYING]
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) [DEPLOYING].
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) exceeded the 80 characters length limit and was truncated.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1).
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) [DEPLOYING].
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) [DEPLOYING].
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) exceeded the 80 characters length limit and was truncated.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from DEPLOYING to RUNNING.
[grpc-default-executor-0] INFO sdk_worker.run - Got work 5
[grpc-default-executor-0] INFO sdk_worker.run - Got work 6
[jobmanager-future-thread-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at group) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at group) (1/1).
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) [DEPLOYING]
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) [DEPLOYING].
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) [DEPLOYING].
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/InitializeWrite output_tags=['out']>
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) switched from DEPLOYING to RUNNING.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/InitializeWrite output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/InitializeWrite.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68).
[grpc-default-executor-1] INFO sdk_worker.run - No more requests from control plane
[grpc-default-executor-1] INFO sdk_worker.run - SDK Harness waiting for in-flight requests to complete
[grpc-default-executor-1] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[grpc-default-executor-1] INFO data_plane.close - Closing all cached grpc data channels.
[grpc-default-executor-1] INFO sdk_worker.close - Closing all cached gRPC state handlers.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) 2d55c15aa0e8ca1b8d4063a1005f7c68.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:1/out.0) (1/1) (2d55c15aa0e8ca1b8d4063a1005f7c68) switched from RUNNING to FINISHED.
[jobmanager-future-thread-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) switched from CREATED to SCHEDULED.
[grpc-default-executor-1] INFO sdk_worker.run - Done consuming work.
[grpc-default-executor-1] INFO sdk_worker_main.main - Python sdk harness exiting.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) switched from SCHEDULED to DEPLOYING.
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b).
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1).
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) f2c6d7c2ecf4fed0c19edaaca110bc6b.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (f2c6d7c2ecf4fed0c19edaaca110bc6b) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) switched from DEPLOYING to RUNNING.
[jobmanager-future-thread-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1).
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b).
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at group) 6c79e804dd96fc74b2703f03d7fdc06b.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (6c79e804dd96fc74b2703f03d7fdc06b) switched from RUNNING to FINISHED.
[Finalizer] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[Finalizer] WARN org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory - Error cleaning up environment url: "21c4c2883172"
java.lang.IllegalStateException: call already closed
at org.apache.beam.vendor.guava.v20.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl.close(ServerCallImpl.java:172)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onCompleted(ServerCalls.java:358)
at org.apache.beam.runners.fnexecution.state.GrpcStateService.close(GrpcStateService.java:54)
at org.apache.beam.runners.fnexecution.GrpcFnServer.close(GrpcFnServer.java:83)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.close(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.lambda$createEnvironmentCache$0(DockerJobBundleFactory.java:163)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1963)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3562)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3538)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3309)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.clear(LocalCache.java:4322)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4937)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.close(DockerJobBundleFactory.java:201)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.finalize(BatchFlinkExecutableStageContext.java:73)
at java.lang.System$2.invokeFinalize(System.java:1270)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:98)
at java.lang.ref.Finalizer.access$100(Finalizer.java:34)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:210)
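The WARN and IllegalStateException above come from finalizer-driven cleanup of the previous SDK harness environment: GrpcStateService.close() completes a gRPC call that has already been closed, so the second close is rejected. It looks harmless for the job itself. A generic sketch (in Python, not Beam's Java code) of the usual guard that makes a close idempotent, assuming a single close_fn that must run at most once:

    # Generic idempotent-close guard (illustration only, not Beam's GrpcStateService):
    # a second close() becomes a no-op instead of raising "call already closed".
    import threading

    class CloseOnce(object):
        def __init__(self, close_fn):
            self._close_fn = close_fn   # the real close action, run at most once
            self._closed = False
            self._lock = threading.Lock()

        def close(self):
            with self._lock:
                if self._closed:
                    return              # already closed; ignore repeat calls
                self._closed = True
            self._close_fn()

    def _do_close():
        print("closing state stream")

    closer = CloseOnce(_do_close)
    closer.close()
    closer.close()  # safe: second call is ignored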
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image 21c4c2883172
java.io.IOException: Received exit code 1 for command 'docker pull 21c4c2883172'. stderr: Error response from daemon: pull access denied for 21c4c2883172, repository does not exist or may require 'docker login'
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:150)
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:77)
at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:147)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:174)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:170)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4992)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.forStage(DockerJobBundleFactory.java:183)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.getStageBundleFactory(BatchFlinkExecutableStageContext.java:55)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:96)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
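The docker pull failure just above appears to be benign in this run: harness_docker_image is set to what looks like a local image ID (21c4c2883172), which no registry can serve, so the pull is denied, yet the locally built SDK harness container evidently still starts, since the logging and control clients connect a few lines further down. A small sketch, assuming the docker CLI is available, of checking for a local image before falling back to a pull (hypothetical helper, not Beam's DockerCommand):

    # Hypothetical helper (not Beam's DockerCommand): only pull when the image
    # reference is not already present locally, e.g. a bare ID like "21c4c2883172".
    import subprocess

    def ensure_image(image):
        # `docker images -q <ref>` prints an ID when the reference resolves locally.
        local_id = subprocess.check_output(["docker", "images", "-q", image]).strip()
        if local_id:
            return local_id  # already available; pulling a bare ID would be denied
        subprocess.check_call(["docker", "pull", image])
        return image

    print(ensure_image("21c4c2883172"))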
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST -> 1 artifacts
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetArtifact name: "pickled_main_session"
retrieval_token: "/tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/MANIFEST"
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Artifact pickled_main_session located in /tmp/flink-artifacts/job_6cec63d2-dd99-40dd-a0cf-086adbf9b33d/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-0] INFO sdk_worker_main.main - Logging handler created.
[grpc-default-executor-0] INFO sdk_worker_main.main - semi_persistent_directory: /tmp
[grpc-default-executor-0] INFO sdk_worker_main.start - Status HTTP server running at localhost:45973
[grpc-default-executor-0] INFO sdk_worker_main.main - Python sdk harness started with pipeline_options: {u'beam:option:dry_run:v1': False, u'beam:option:harness_docker_image:v1': u'21c4c2883172', u'beam:option:pipeline_type_check:v1': True, u'beam:option:job_endpoint:v1': u'localhost:8099', u'beam:option:dataflow_endpoint:v1': u'https://dataflow.googleapis.com', u'beam:option:runner:v1': None, u'beam:option:sdk_location:v1': u'container', u'beam:option:direct_runner_use_stacked_bundle:v1': True, u'beam:option:runtime_type_check:v1': False, u'beam:option:flink_master:v1': u'[auto]', u'beam:option:save_main_session:v1': True, u'beam:option:type_check_strictness:v1': u'DEFAULT_TO_ANY', u'beam:option:region:v1': u'us-central1', u'beam:option:profile_memory:v1': False, u'beam:option:profile_cpu:v1': False, u'beam:option:app_name:v1': None, u'beam:option:options_id:v1': 1, u'beam:option:no_auth:v1': False, u'beam:option:streaming:v1': False, u'beam:option:experiments:v1': [u'beam_fn_api'], u'beam:option:job_name:v1': u'BeamApp-ryan-0726042817-60555634'}
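The options dump above shows the job was submitted from the Python SDK against the job server at localhost:8099 with the beam_fn_api experiment, save_main_session, and harness_docker_image pointing at the locally built container. A hedged reconstruction of that submission is sketched below; the wordcount entry point, the runner flag, and the input/output paths are assumptions based on the option names and job name in the log, not values copied from it:

    # Sketch only: a submission consistent with the "beam:option:*" keys above.
    # Input/output paths are placeholders; 21c4c2883172 is the image ID from the log.
    from apache_beam.examples import wordcount

    wordcount.run([
        "--runner=PortableRunner",
        "--job_endpoint=localhost:8099",
        "--experiments=beam_fn_api",
        "--harness_docker_image=21c4c2883172",
        "--save_main_session",
        "--input=/path/to/input.txt",          # placeholder
        "--output=/tmp/py-wordcount-direct",   # inferred from the beam-temp paths later in the log
    ])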
[grpc-default-executor-0] INFO sdk_worker.__init__ - Creating insecure control channel.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Control channel established.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Initializing SDKHarness with 12 workers.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 2
[grpc-default-executor-0] INFO sdk_worker.run - Got work 3
[grpc-default-executor-0] INFO sdk_worker.create_state_handler - Creating channel for host.docker.internal:58193
[grpc-default-executor-0] INFO sdk_worker.run - Got work 4
[grpc-default-executor-0] INFO data_plane.create_data_channel - Creating channel for host.docker.internal:58192
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
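The channel-creation lines above show the containerized harness dialing back to the job server's state and data endpoints on the host via host.docker.internal (ports 58193 and 58192 in this run). A minimal sketch of that kind of connection check from inside a container, using the standard grpc Python API (the port is taken from the log; the timeout is an arbitrary choice):

    # Sketch: dialing the host's data endpoint from inside a container the way the
    # harness's data plane does, then waiting for the channel to become ready.
    import grpc

    channel = grpc.insecure_channel("host.docker.internal:58192")
    grpc.channel_ready_future(channel).result(timeout=10)  # raises on timeout
    print("data-plane channel ready")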
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation format output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation count output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/WindowInto(WindowIntoFn) output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/Pair output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[StrUtf8Coder], IterableCoder[VarIntCoder]]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[StrUtf8Coder], IterableCoder[VarIntCoder]]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/WriteBundles output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation count output_tags=['out'], receivers=[ConsumerSet[count.out0, coder=WindowedValueCoder[TupleCoder[StrUtf8Coder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation format output_tags=['out'], receivers=[ConsumerSet[format.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15).
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) [FINISHED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) 1df7541538dafee882b539a746474c15.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (1df7541538dafee882b539a746474c15) switched from RUNNING to FINISHED.
[grpc-default-executor-1] ERROR sdk_worker._execute - Error processing instruction 4. Original traceback is
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
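This traceback is the actual failure behind instruction 4: write/Write/WriteImpl/WriteBundles tries to open a shard under /tmp/beam-temp-py-wordcount-direct-.../ with the local filesystem sink, but that temporary directory does not exist inside the SDK harness container running the bundle. Most likely it was created by InitializeWrite in the earlier harness container (which exited above), and a container-local /tmp is not shared across containers or with the host. A minimal reproduction of the failure mode, with a hypothetical path standing in for the Beam temp directory:

    # Minimal repro of the failure mode: opening a file whose parent directory is
    # missing on the local filesystem raises errno 2, exactly what the text sink hits.
    import errno

    missing_path = "/tmp/beam-temp-DOES-NOT-EXIST/shard-0000"  # hypothetical path

    try:
        with open(missing_path, "wb") as f:
            f.write(b"data")
    except (IOError, OSError) as e:
        assert e.errno == errno.ENOENT
        print("No such file or directory: %s" % missing_path)

One common way around this in a Dockerized portable setup is to point --output at a location every harness container and the job server can actually see (a mounted volume or a distributed filesystem) instead of a container-local /tmp.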
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) switched from RUNNING to FAILED.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
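The failure repeated throughout this log comes down to the last few Python frames: the write/Write/WriteImpl/WriteBundles step in the SDK harness builds a temp shard path under the directory that the write's initialize step created, and a plain open() on that path fails with "[Errno 2] No such file or directory". A rough sketch of what those frames do, with the literal values lifted from the traceback above (the real Beam 2.x filebasedsink/localfilesystem code differs in detail, so treat this as an approximation):

    import os
    import uuid

    # Values taken from the traceback above (illustrative only):
    init_result = '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001'
    uid = str(uuid.uuid4())          # e.g. 3f84b72b-c2ff-43f1-9479-65465779d7b4
    suffix = '.py-wordcount-direct'

    # FileBasedSinkWriter joins these into the temp shard path ...
    temp_shard_path = os.path.join(init_result, uid) + suffix

    # ... and the local filesystem path ultimately goes through a plain open(path, 'wb'),
    # which raises ENOENT whenever init_result does not exist on the filesystem that
    # this particular harness process actually sees.
    try:
        handle = open(temp_shard_path, 'wb')
        handle.close()
    except IOError as err:
        print('IOError: %s' % err)   # matches the error wrapped in the RuntimeError above

One likely (but unconfirmed from this log alone) cause is that the dockerized Python SDK harness has its own /tmp, so the beam-temp-* directory created for a local output path such as the wordcount default never exists inside the container that runs WriteBundles; pointing the pipeline's --output at a location every harness can reach, or sharing that directory with the harness container, avoids the missing-directory condition.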
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a).
[CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) [FAILED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FAILED to JobManager for task CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) 793aab96f981b099b01418f28647749a.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/21c4c2883172:0) -> FlatMap (FlatMap at 6format.None/21c4c2883172:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (793aab96f981b099b01418f28647749a) switched from RUNNING to FAILED.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56) switched from state RUNNING to FAILING.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 1b5045002fde9a4886f8a491bb7cbb92.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 7f737caa370dd3d122ed89a3b46e8dcc.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution b813274af2a5a4987a3111d048af35cc.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 72ce29bca0224fe136bbaf6af682e95a.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution ba76a328d96545d7abe5a083034bbb33.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (d0ee330246953236926b9e386498790e) switched from CREATED to CANCELED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 2d55c15aa0e8ca1b8d4063a1005f7c68.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 99ed9d40481e0ca51c6c0b377743a520.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 1f9f51fffd00dd572ef416b9e1990c4a.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution f2c6d7c2ecf4fed0c19edaaca110bc6b.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 6c79e804dd96fc74b2703f03d7fdc06b.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 1df7541538dafee882b539a746474c15.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/21c4c2883172:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/21c4c2883172:0/out.0) (1/1) (ee91e164b4597610f8357190d3bfdd7e) switched from CREATED to CANCELED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 793aab96f981b099b01418f28647749a.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task code CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@62a118b9) (1/1) (0e9e6e247ca3ccd800390cb4d336e5f4) switched from CREATED to CANCELED.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from CANCELING to CANCELED.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c).
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) [CANCELED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskmanager.Task - Attempting to cancel task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from RUNNING to CANCELING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskmanager.Task - Triggering cancellation of task code MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a).
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from CANCELING to CANCELED.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a).
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state CANCELED to JobManager for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) ca8587099becc002dd08ca77e6eec98c.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) [CANCELED]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state CANCELED to JobManager for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) 8209a5f910e4c2f363c2a3ab73a1f74a.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:0/out.0) (1/1) (ca8587099becc002dd08ca77e6eec98c) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/21c4c2883172:2) (1/1) (8209a5f910e4c2f363c2a3ab73a1f74a) switched from CANCELING to CANCELED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Try to restart or fail the job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56) if no longer possible.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56) switched from state FAILING to FAILED.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Could not restart the job BeamApp-ryan-0726042817-60555634 (5aaef94e469fb53caa5e501440cffd56) because the restart strategy prevented it.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job 5aaef94e469fb53caa5e501440cffd56 reached globally terminal state FAILED.
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[grpc-default-executor-0] INFO sdk_worker.run - No more requests from control plane
[grpc-default-executor-0] INFO sdk_worker.run - SDK Harness waiting for in-flight requests to complete
[grpc-default-executor-0] INFO data_plane.close - Closing all cached grpc data channels.
[grpc-default-executor-0] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[grpc-default-executor-0] INFO sdk_worker.close - Closing all cached gRPC state handlers.
[grpc-default-executor-0] INFO sdk_worker.run - Done consuming work.
[grpc-default-executor-0] INFO sdk_worker_main.main - Python sdk harness exiting.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-ryan-0726042817-60555634(5aaef94e469fb53caa5e501440cffd56).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 222088da9b64e65315cc89bced5468ed: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher2d71dc22-1c36-44d2-a660-b2a8548810af.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher2d71dc22-1c36-44d2-a660-b2a8548810af.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed spill file directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-io-6456b2e2-a532-4faa-89e5-6fe8067f0831
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 98f27bde8b1b664da5f2986aea324ea8@akka://flink/user/jobmanager_1 for job 5aaef94e469fb53caa5e501440cffd56 from the resource manager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job 5aaef94e469fb53caa5e501440cffd56 with leader id 98f27bde8b1b664da5f2986aea324ea8 lost leadership.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher2d71dc22-1c36-44d2-a660-b2a8548810af.
[ForkJoinPool.commonPool-worker-3] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-web-ui
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Unregister TaskManager d9d294ec25da1a7d8553c62eb1c8f864 from the SlotManager.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:58132
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-server] ERROR org.apache.beam.runners.flink.FlinkJobInvocation - Error during job invocation BeamApp-ryan-0726042817-60555634_66b45173-cafa-4e0b-92e7-60e753d9eb00.
org.apache.flink.runtime.client.JobExecutionException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:625)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:234)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at org.apache.beam.runners.flink.FlinkJobInvocation.runPipeline(FlinkJobInvocation.java:116)
at org.apache.beam.repackaged.beam_runners_flink_2.11.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:111)
at org.apache.beam.repackaged.beam_runners_flink_2.11.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:58)
at org.apache.beam.repackaged.beam_runners_flink_2.11.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:75)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 4: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 525, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
windowed_value, self.process_method(*args_for_process))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1032, in process
self.writer = self.sink.open_writer(init_result, str(uuid.uuid4()))
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 184, in open_writer
return FileBasedSinkWriter(self, os.path.join(init_result, uid) + suffix)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 384, in __init__
self.temp_handle = self.sink.open(temp_shard_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/textio.py", line 382, in open
file_handle = super(_TextSink, self).open(temp_path)
File "/usr/local/lib/python2.7/site-packages/apache_beam/options/value_provider.py", line 133, in _f
return fnc(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 128, in open
return FileSystems.create(temp_path, self.mime_type, self.compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filesystems.py", line 187, in create
return filesystem.create(path, mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 147, in create
return self._path_open(path, 'wb', mime_type, compression_type)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/localfilesystem.py", line 130, in _path_open
raw_file = open(path, mode)
RuntimeError: IOError: [Errno 2] No such file or directory: '/tmp/beam-temp-py-wordcount-direct-55b0feda908c11e88deb025000000001/3f84b72b-c2ff-43f1-9479-65465779d7b4.py-wordcount-direct' [while running 'write/Write/WriteImpl/WriteBundles']
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
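
The failure the job server keeps reporting is the Python text sink being unable to open its temp shard under /tmp/beam-temp-py-wordcount-direct-.../ from inside the SDK harness container, whose filesystem does not necessarily contain the directory that the write's initialization step created. A minimal sketch of the call the traceback bottoms out in, with a hypothetical path rather than the one in the log:

# Sketch only: Beam's LocalFileSystem does not create missing parent
# directories, so FileSystems.create() raises "No such file or directory"
# when the sink's temp directory is absent in the environment that runs
# the WriteBundles step.
from apache_beam.io.filesystems import FileSystems

temp_shard = '/tmp/beam-temp-example/0000-of-0001.txt'  # hypothetical path
handle = FileSystems.create(temp_shard, mime_type='text/plain')  # fails if /tmp/beam-temp-example does not exist
handle.write(b'hello\n')
handle.close()
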
[Finalizer] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[Finalizer] WARN org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory - Error cleaning up environment url: "21c4c2883172"
java.lang.IllegalStateException: call already closed
at org.apache.beam.vendor.guava.v20.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl.close(ServerCallImpl.java:172)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onCompleted(ServerCalls.java:358)
at org.apache.beam.runners.fnexecution.state.GrpcStateService.close(GrpcStateService.java:54)
at org.apache.beam.runners.fnexecution.GrpcFnServer.close(GrpcFnServer.java:83)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.close(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.lambda$createEnvironmentCache$0(DockerJobBundleFactory.java:163)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1963)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3562)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3538)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3309)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.clear(LocalCache.java:4322)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4937)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.close(DockerJobBundleFactory.java:201)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.finalize(BatchFlinkExecutableStageContext.java:73)
at java.lang.System$2.invokeFinalize(System.java:1270)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:98)
at java.lang.ref.Finalizer.access$100(Finalizer.java:34)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:210)
<============-> 98% EXECUTING [29s]
> :beam-runners-flink_2.11-job-server:runShadow
> IDLE
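
For context, a pipeline shaped like the one in these logs can be submitted to the job server roughly as below. This is a sketch under assumptions: the job endpoint port and the input/output paths are illustrative, not taken from the log. Writing the output to a host-local /tmp path is what leads to the IOError above when the SDK harness runs in Docker, since that path is not shared with the container.

# Hypothetical reproduction sketch of the failing Python wordcount submission.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions([
    '--runner=PortableRunner',
    '--job_endpoint=localhost:8099',  # assumed job-server address
])

with beam.Pipeline(options=options) as p:
    (p
     | 'read' >> beam.io.ReadFromText('/tmp/kinglear.txt')          # illustrative input
     | 'split' >> beam.FlatMap(lambda line: line.split())
     | 'count' >> beam.combiners.Count.PerElement()
     | 'format' >> beam.Map(lambda word_count: '%s: %d' % word_count)
     | 'write' >> beam.io.WriteToText('/tmp/py-wordcount-direct'))  # host-local path; not visible inside the Docker SDK harness

Pointing the output (and therefore the sink's temp directory) at a location every harness container can reach, such as a directory mounted into the containers or an object store, should avoid the missing-directory error.
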
^C Thu 04:28:31 ryan@mbp: beam:staging$ ./gradlew :beam-runners-flink_2.11-job-server:runShadow
Parallel execution is an incubating feature.
Parallel execution with configuration on demand is an incubating feature.
> Configure project :beam-model-pipeline
applyPortabilityNature with default configuration for project beam-model-pipeline
> Configure project :beam-model-fn-execution
applyPortabilityNature with default configuration for project beam-model-fn-execution
> Configure project :beam-model-job-management
applyPortabilityNature with default configuration for project beam-model-job-management
> Task :beam-runners-flink_2.11-job-server:runShadow
Listening for transport dt_socket at address: 5005
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - stored metadata: staging_session_token: "{\"sessionId\":\"job_aec90381-b818-4f85-94cd-9c44e0a1b3ef\",\"basePath\":\"/tmp/flink-artifacts\"}"
metadata {
name: "pickled_main_session"
md5: "UoIk3316DjrZqF3HP8dyBg=="
}
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Going to stage artifact pickled_main_session to /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService - Staging artifact completed for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-0] INFO org.apache.beam.runners.flink.FlinkJobInvoker - Invoking job BeamApp-ryan-0726042851-cb36f0c0_53d0009c-40d9-4611-89b2-528ab81eb15c
[grpc-default-executor-0] INFO org.apache.beam.runners.flink.FlinkJobInvocation - Starting job invocation BeamApp-ryan-0726042851-cb36f0c0_53d0009c-40d9-4611-89b2-528ab81eb15c
[flink-runner-job-server] INFO org.apache.beam.runners.flink.FlinkJobInvocation - Translating pipeline to Flink program.
[grpc-default-executor-0] WARN org.apache.beam.runners.flink.FlinkJobInvocation - addMessageObserver() not yet implemented.
[flink-runner-job-server] INFO org.apache.beam.runners.flink.FlinkExecutionEnvironments - Creating a Batch Execution Environment.
[flink-runner-job-server] INFO org.apache.flink.api.java.ExecutionEnvironment - The job has 0 registered types and 0 default Kryo serializers
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting Flink Mini Cluster
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting Metrics Registry
[flink-runner-job-server] INFO org.apache.flink.runtime.metrics.MetricRegistryImpl - No metrics reporter configured, no metrics will be exposed/reported.
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting RPC Service(s)
[flink-akka.actor.default-dispatcher-4] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting high-availability services
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.BlobServer - Created BLOB server storage directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/blobStore-2fc8b392-6721-40f2-b14c-6034a0afae58
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.BlobServer - Started BLOB server at 0.0.0.0:58215 - max concurrent requests: 50 - max backlog: 1000
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting ResourceManger
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.resourcemanager.StandaloneResourceManager at akka://flink/user/resourcemanager_784790a1-7b1a-43b7-9733-659adb05bdfb .
[flink-runner-job-server] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.resourcemanager.StandaloneResourceManager@45100c5 @ akka://flink/user/resourcemanager_784790a1-7b1a-43b7-9733-659adb05bdfb
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - ResourceManager akka://flink/user/resourcemanager_784790a1-7b1a-43b7-9733-659adb05bdfb was granted leadership with fencing token b3b39bf760308bd466292e7cdd064dad
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Starting the SlotManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader akka://flink/user/resourcemanager_784790a1-7b1a-43b7-9733-659adb05bdfb , session=66292e7c-dd06-4dad-b3b3-9bf760308bd4
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Created BLOB cache storage directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/blobStore-bdec9e0d-db60-4cfe-947f-8d19bc66fcda
[flink-runner-job-server] INFO org.apache.flink.runtime.blob.TransientBlobCache - Created BLOB cache storage directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/blobStore-0098f7c0-e6d0-4bc1-b642-39df20aa5b0c
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting 1 TaskManger(s)
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.TaskManagerServices - Temporary file directory '/var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T': total 931 GB, usable 422 GB (45.33% usable)
[flink-runner-job-server] INFO org.apache.flink.runtime.io.network.buffer.NetworkBufferPool - Allocated 404 MB for network buffer pool (number of memory segments: 12945, bytes per segment: 32768).
[flink-runner-job-server] INFO org.apache.flink.runtime.query.QueryableStateUtils - Could not load Queryable State Client Proxy. Probable reason: flink-queryable-state-runtime is not in the classpath. To enable Queryable State, please move the flink-queryable-state-runtime jar from the opt to the lib folder.
[flink-runner-job-server] INFO org.apache.flink.runtime.query.QueryableStateUtils - Could not load Queryable State Server. Probable reason: flink-queryable-state-runtime is not in the classpath. To enable Queryable State, please move the flink-queryable-state-runtime jar from the opt to the lib folder.
[flink-runner-job-server] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Starting the network environment and its components.
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.TaskManagerServices - Limiting managed memory to 0.7 of the currently free heap space (2537 MB), memory will be allocated lazily.
[flink-runner-job-server] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager uses directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-io-15e5a681-f1fa-41e5-a008-1be13cbdd91e for spill files.
[flink-runner-job-server] INFO org.apache.flink.runtime.filecache.FileCache - User file cache uses directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-dist-cache-0d8e62b9-bedb-465c-bacf-34fd72871440
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.TaskManagerConfiguration - Messages have a max timeout of 10000 ms
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.taskexecutor.TaskExecutor at akka://flink/user/taskmanager_0 .
[flink-runner-job-server] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Start job leader service.
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting dispatcher rest endpoint.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Connecting to ResourceManager akka://flink/user/resourcemanager_784790a1-7b1a-43b7-9733-659adb05bdfb(b3b39bf760308bd466292e7cdd064dad).
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Resolved ResourceManager address, beginning registration
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Registration at ResourceManager attempt 1 (timeout=100ms)
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Successful registration at resource manager akka://flink/user/resourcemanager_784790a1-7b1a-43b7-9733-659adb05bdfb under registration id 34838cf13f97a5e7037230e6b59e574f.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Registering TaskManager a3066d66-161e-4170-ad06-28c0788e5f40 under 34838cf13f97a5e7037230e6b59e574f at the SlotManager.
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Starting rest endpoint.
[flink-runner-job-server] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - Log file environment variable 'log.file' is not set.
[flink-runner-job-server] WARN org.apache.flink.runtime.webmonitor.WebMonitorUtils - JobManager log files are unavailable in the web dashboard. Log file location not found in environment variable 'log.file' or configuration key 'Key: 'web.log.path' , default: null (deprecated keys: [jobmanager.web.log.path])'.
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Failed to load web based job submission extension. Probable reason: flink-runtime-web is not in the classpath.
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Rest endpoint listening at localhost:58216
[flink-runner-job-server] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint@2ec001f5 @ http://localhost:58216
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - http://localhost:58216 was granted leadership with leaderSessionID=1aa128a7-0a45-464f-8903-cc1d20d0bcf9
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Starting job dispatcher(s) for JobManger
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader http://localhost:58216 , session=1aa128a7-0a45-464f-8903-cc1d20d0bcf9
[flink-runner-job-server] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.dispatcher.StandaloneDispatcher at akka://flink/user/dispatcher327263a8-9047-41de-b5ea-a1ebcda651b7 .
[flink-runner-job-server] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.dispatcher.StandaloneDispatcher@49cad39 @ akka://flink/user/dispatcher327263a8-9047-41de-b5ea-a1ebcda651b7
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Flink Mini Cluster started successfully
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Dispatcher akka://flink/user/dispatcher327263a8-9047-41de-b5ea-a1ebcda651b7 was granted leadership with fencing token ad305efabd8408346575b50251224da7
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Recovering all persisted jobs.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader akka://flink/user/dispatcher327263a8-9047-41de-b5ea-a1ebcda651b7 , session=6575b502-5122-4da7-ad30-5efabd840834
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Submitting job d865612786428082b713c0c3ddb6338d (BeamApp-ryan-0726042851-cb36f0c0).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.jobmaster.JobMaster at akka://flink/user/jobmanager_1 .
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Initializing job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Using restart strategy NoRestartStrategy for BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Starting RPC endpoint for org.apache.flink.runtime.jobmaster.slotpool.SlotPool at akka://flink/user/76ded675-ba5b-4271-94ab-0a1965204a97 .
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job recovers via failover strategy: full graph restart
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Running initialization on master for job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.JobMaster - Successfully ran initialization on master in 1 ms.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Proposing leadership to contender org.apache.flink.runtime.jobmaster.JobManagerRunner@7e82cef2 @ akka://flink/user/jobmanager_1
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.JobManagerRunner - JobManager runner for job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d) was granted leadership with session id 77f79eaf-d5e6-463f-9a43-b4bdc5c1fc66 at akka://flink/user/jobmanager_1.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Starting execution of job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d)
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d) switched from state CREATED to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-1] INFO org.apache.flink.runtime.highavailability.nonha.embedded.EmbeddedLeaderService - Received confirmation of leadership for leader akka://flink/user/jobmanager_1 , session=77f79eaf-d5e6-463f-9a43-b4bdc5c1fc66
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Connecting to ResourceManager akka://flink/user/resourcemanager_784790a1-7b1a-43b7-9733-659adb05bdfb(b3b39bf760308bd466292e7cdd064dad)
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Resolved ResourceManager address, beginning registration
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Registration at ResourceManager attempt 1 (timeout=100ms)
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Registering job manager 9a43b4bdc5c1fc6677f79eafd5e6463f@akka://flink/user/jobmanager_1 for job d865612786428082b713c0c3ddb6338d.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Cannot serve slot request, no ResourceManager connected. Adding as pending request 9b16a1836536152c43995920c433f63b
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Registered job manager 9a43b4bdc5c1fc6677f79eafd5e6463f@akka://flink/user/jobmanager_1 for job d865612786428082b713c0c3ddb6338d.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.JobMaster - JobManager successfully registered at ResourceManager, leader id: b3b39bf760308bd466292e7cdd064dad.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Requesting slot with profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} from resource manager (request = 9b16a1836536152c43995920c433f63b).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Request slot with profile ResourceProfile{cpuCores=-1.0, heapMemoryInMB=-1, directMemoryInMB=0, nativeMemoryInMB=0, networkMemoryInMB=0} for job d865612786428082b713c0c3ddb6338d with allocation id c85d2a2a93a854f4d508aa05e74170a6.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Receive slot request c85d2a2a93a854f4d508aa05e74170a6 for job d865612786428082b713c0c3ddb6338d from resource manager with leader id b3b39bf760308bd466292e7cdd064dad.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Allocated slot for c85d2a2a93a854f4d508aa05e74170a6.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Add job d865612786428082b713c0c3ddb6338d for job leader monitoring.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Try to register at job manager akka://flink/user/jobmanager_1 with leader id 77f79eaf-d5e6-463f-9a43-b4bdc5c1fc66.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Resolved JobManager address, beginning registration
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Registration at JobManager attempt 1 (timeout=100ms)
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Successful registration at job manager akka://flink/user/jobmanager_1 for job d865612786428082b713c0c3ddb6338d.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Establish JobManager connection for job d865612786428082b713c0c3ddb6338d.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Offer reserved slots to the leader of job d865612786428082b713c0c3ddb6338d.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.slot.TaskSlotTable - Activate slot c85d2a2a93a854f4d508aa05e74170a6.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1).
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) switched from CREATED to DEPLOYING.
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) [DEPLOYING]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1).
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) switched from CREATED to DEPLOYING.
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) [DEPLOYING]
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) [DEPLOYING].
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) [DEPLOYING].
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) [DEPLOYING].
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) [DEPLOYING].
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) switched from DEPLOYING to RUNNING.
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) switched from DEPLOYING to RUNNING.
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) exceeded the 80 characters length limit and was truncated.
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) exceeded the 80 characters length limit and was truncated.
[jobmanager-future-thread-1] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) switched from RUNNING to FINISHED.
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) switched from RUNNING to FINISHED.
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209).
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602).
[DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) [FINISHED]
[DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) 694ea8bb3735c96cc9b30dbb931a8602.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) 7b7df7e953db2fdd24f1f17880c93209.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at read/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (694ea8bb3735c96cc9b30dbb931a8602) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSource (at write/Write/WriteImpl/DoOnce/Read/Impulse (org.apache.beam.runners.flink.translation.wrappers.ImpulseInputFormat)) (1/1) (7b7df7e953db2fdd24f1f17880c93209) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1).
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1).
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image b37d84d99d38
java.io.IOException: Received exit code 1 for command 'docker pull b37d84d99d38'. stderr: Error response from daemon: pull access denied for b37d84d99d38, repository does not exist or may require 'docker login'
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:150)
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:77)
at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:147)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:174)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:170)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4992)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.forStage(DockerJobBundleFactory.java:183)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.getStageBundleFactory(BatchFlinkExecutableStageContext.java:55)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:96)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
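
The WARN and stack trace above are the one failure in this stretch of the run: the pipeline was submitted with harness_docker_image=b37d84d99d38 (visible in the pipeline_options logged further down), which looks like a bare local image ID rather than a registry repository, so the runner's `docker pull` is refused. Judging by the SDK-harness logs that follow, the container still appears to start from the locally cached image, so the failed pull seems non-fatal here. A minimal, hypothetical spot check, assuming Docker is available on the job-server host:

$ docker images --no-trunc | grep b37d84d99d38    # is the image cached locally?
$ docker pull b37d84d99d38                        # reproduces "pull access denied ... repository does not exist"
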
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Loading manifest for retrieval token /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Manifest at /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST has 1 artifact locations
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST -> 1 artifacts
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetArtifact name: "pickled_main_session"
retrieval_token: "/tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST"
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Artifact pickled_main_session located in /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
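
For reference, the staging layout behind the GetManifest/GetArtifact lines above is just a MANIFEST file plus one content-addressed file per staged artifact under artifacts/. A hypothetical spot check, assuming the /tmp/flink-artifacts base path used in this run:

$ ls -R /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/    # expect MANIFEST and artifacts/artifact_<sha256>
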
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-0] INFO sdk_worker_main.main - Logging handler created.
[grpc-default-executor-0] INFO sdk_worker_main.main - semi_persistent_directory: /tmp
[grpc-default-executor-0] INFO sdk_worker_main.start - Status HTTP server running at localhost:45941
[grpc-default-executor-0] INFO sdk_worker_main.main - Python sdk harness started with pipeline_options: {u'beam:option:dry_run:v1': False, u'beam:option:harness_docker_image:v1': u'b37d84d99d38', u'beam:option:pipeline_type_check:v1': True, u'beam:option:job_endpoint:v1': u'localhost:8099', u'beam:option:dataflow_endpoint:v1': u'https://dataflow.googleapis.com', u'beam:option:runner:v1': None, u'beam:option:sdk_location:v1': u'container', u'beam:option:direct_runner_use_stacked_bundle:v1': True, u'beam:option:runtime_type_check:v1': False, u'beam:option:flink_master:v1': u'[auto]', u'beam:option:save_main_session:v1': True, u'beam:option:type_check_strictness:v1': u'DEFAULT_TO_ANY', u'beam:option:region:v1': u'us-central1', u'beam:option:profile_memory:v1': False, u'beam:option:profile_cpu:v1': False, u'beam:option:app_name:v1': None, u'beam:option:options_id:v1': 1, u'beam:option:no_auth:v1': False, u'beam:option:streaming:v1': False, u'beam:option:experiments:v1': [u'beam_fn_api'], u'beam:option:job_name:v1': u'BeamApp-ryan-0726042851-cb36f0c0'}
[grpc-default-executor-0] INFO sdk_worker.__init__ - Creating insecure control channel.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Control channel established.
[grpc-default-executor-0] INFO sdk_worker.__init__ - Initializing SDKHarness with 12 workers.
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 1
[grpc-default-executor-0] INFO sdk_worker.run - Got work 2
[grpc-default-executor-1] INFO sdk_worker.run - Got work 1
[grpc-default-executor-1] INFO sdk_worker.run - Got work 3
[grpc-default-executor-1] INFO sdk_worker.create_state_handler - Creating channel for host.docker.internal:58235
[grpc-default-executor-1] INFO sdk_worker.run - Got work 4
[grpc-default-executor-1] INFO data_plane.create_data_channel - Creating channel for host.docker.internal:58234
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/AddRandomKeys output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Split output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/AddRandomKeys output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Split output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Split output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Split.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/AddRandomKeys output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Split output_tags=['out'], receivers=[ConsumerSet[read/Read/Split.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/AddRandomKeys output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[jobmanager-future-thread-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1).
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) [DEPLOYING]
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) [DEPLOYING].
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) [DEPLOYING].
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1).
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) [DEPLOYING]
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) [DEPLOYING].
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec).
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) [DEPLOYING].
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348).
[CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) 185d814b196ed19e3c6c77928e468bec.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 17read/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 17read/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: read/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (185d814b196ed19e3c6c77928e468bec) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) 4c995c29bff1bde56929b01e36a02348.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0) -> FlatMap (FlatMap at 41write/Write/WriteImpl/DoOnce/Read/Impulse.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) -> Map (Key Extractor) (1/1) (4c995c29bff1bde56929b01e36a02348) switched from RUNNING to FINISHED.
[grpc-default-executor-0] INFO sdk_worker.run - No more requests from control plane
[grpc-default-executor-0] INFO sdk_worker.run - SDK Harness waiting for in-flight requests to complete
[jobmanager-future-thread-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[jobmanager-future-thread-1] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (attempt #0) to localhost
[grpc-default-executor-0] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9).
[GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1).
[grpc-default-executor-2] INFO data_plane.close - Closing all cached grpc data channels.
[grpc-default-executor-2] INFO sdk_worker.close - Closing all cached gRPC state handlers.
[grpc-default-executor-2] INFO sdk_worker.run - Done consuming work.
[grpc-default-executor-2] INFO sdk_worker_main.main - Python sdk harness exiting.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) [DEPLOYING].
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1).
[grpc-default-executor-0] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) c1724171f667445b88042ad0e528bec9.
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) [DEPLOYING].
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at read/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (c1724171f667445b88042ad0e528bec9) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) [DEPLOYING].
[GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) edba53a6ec05da0c4967301743cfb4c0.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) exceeded the 80 characters length limit and was truncated.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey) (1/1) (edba53a6ec05da0c4967301743cfb4c0) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) exceeded the 80 characters length limit and was truncated.
[Finalizer] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[Finalizer] WARN org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory - Error cleaning up environment url: "b37d84d99d38"
java.lang.IllegalStateException: call already closed
at org.apache.beam.vendor.guava.v20.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl.close(ServerCallImpl.java:172)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onCompleted(ServerCalls.java:358)
at org.apache.beam.runners.fnexecution.state.GrpcStateService.close(GrpcStateService.java:54)
at org.apache.beam.runners.fnexecution.GrpcFnServer.close(GrpcFnServer.java:83)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.close(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.lambda$createEnvironmentCache$0(DockerJobBundleFactory.java:163)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1963)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3562)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3538)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3309)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.clear(LocalCache.java:4322)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4937)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.close(DockerJobBundleFactory.java:201)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.finalize(BatchFlinkExecutableStageContext.java:73)
at java.lang.System$2.invokeFinalize(System.java:1270)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:98)
at java.lang.ref.Finalizer.access$100(Finalizer.java:34)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:210)
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image b37d84d99d38
java.io.IOException: Received exit code 1 for command 'docker pull b37d84d99d38'. stderr: Error response from daemon: pull access denied for b37d84d99d38, repository does not exist or may require 'docker login'
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:150)
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:77)
at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:147)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:174)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:170)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4992)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.forStage(DockerJobBundleFactory.java:183)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.getStageBundleFactory(BatchFlinkExecutableStageContext.java:55)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:96)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST -> 1 artifacts
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetArtifact name: "pickled_main_session"
retrieval_token: "/tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST"
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Artifact pickled_main_session located in /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-2] INFO sdk_worker_main.main - Logging handler created.
[grpc-default-executor-2] INFO sdk_worker_main.main - semi_persistent_directory: /tmp
[grpc-default-executor-2] INFO sdk_worker_main.start - Status HTTP server running at localhost:39985
[grpc-default-executor-2] INFO sdk_worker_main.main - Python sdk harness started with pipeline_options: {u'beam:option:dry_run:v1': False, u'beam:option:harness_docker_image:v1': u'b37d84d99d38', u'beam:option:pipeline_type_check:v1': True, u'beam:option:job_endpoint:v1': u'localhost:8099', u'beam:option:dataflow_endpoint:v1': u'https://dataflow.googleapis.com', u'beam:option:runner:v1': None, u'beam:option:sdk_location:v1': u'container', u'beam:option:direct_runner_use_stacked_bundle:v1': True, u'beam:option:runtime_type_check:v1': False, u'beam:option:flink_master:v1': u'[auto]', u'beam:option:save_main_session:v1': True, u'beam:option:type_check_strictness:v1': u'DEFAULT_TO_ANY', u'beam:option:region:v1': u'us-central1', u'beam:option:profile_memory:v1': False, u'beam:option:profile_cpu:v1': False, u'beam:option:app_name:v1': None, u'beam:option:options_id:v1': 1, u'beam:option:no_auth:v1': False, u'beam:option:streaming:v1': False, u'beam:option:experiments:v1': [u'beam_fn_api'], u'beam:option:job_name:v1': u'BeamApp-ryan-0726042851-cb36f0c0'}
[grpc-default-executor-2] INFO sdk_worker.__init__ - Creating insecure control channel.
[grpc-default-executor-2] INFO sdk_worker.__init__ - Control channel established.
[grpc-default-executor-2] INFO sdk_worker.__init__ - Initializing SDKHarness with 12 workers.
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 1
[grpc-default-executor-2] INFO sdk_worker.run - Got work 1
[grpc-default-executor-2] INFO sdk_worker.run - Got work 2
[grpc-default-executor-2] INFO sdk_worker.run - Got work 3
[grpc-default-executor-2] INFO sdk_worker.create_state_handler - Creating channel for host.docker.internal:58256
[grpc-default-executor-2] INFO sdk_worker.run - Got work 4
[grpc-default-executor-2] INFO data_plane.create_data_channel - Creating channel for host.docker.internal:58255
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/ReadSplits output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/RemoveRandomKeys output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation pair_with_one output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation split output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/Reshuffle/RemoveRandomKeys output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/Reshuffle/RemoveRandomKeys.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation read/Read/ReadSplits output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/DoOnce/Read/ReadSplits output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/DoOnce/Read/ReadSplits.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/RemoveRandomKeys output_tags=['out']>
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DoOperation read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out']>
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3).
[CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) 5ac4c56d95ef7ebbd120f5a2320eb5c3.
[grpc-default-executor-0] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 84write/Write/WriteImpl/DoOnce/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (5ac4c56d95ef7ebbd120f5a2320eb5c3) switched from RUNNING to FINISHED.
[jobmanager-future-thread-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) switched from CREATED to SCHEDULED.
[jobmanager-future-thread-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1).
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) [DEPLOYING].
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1).
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) [DEPLOYING].
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1).
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) exceeded the 80 characters length limit and was truncated.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) switched from CREATED to DEPLOYING.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) [DEPLOYING]
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) exceeded the 80 characters length limit and was truncated.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) switched from DEPLOYING to RUNNING.
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) exceeded the 80 characters length limit and was truncated.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) switched from DEPLOYING to RUNNING.
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/Reshuffle/RemoveRandomKeys output_tags=['out'], receivers=[ConsumerSet[read/Read/Reshuffle/RemoveRandomKeys.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation read/Read/ReadSplits output_tags=['out'], receivers=[ConsumerSet[read/Read/ReadSplits.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation split output_tags=['out'], receivers=[ConsumerSet[split.out0, coder=WindowedValueCoder[StrUtf8Coder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation pair_with_one output_tags=['out'], receivers=[ConsumerSet[pair_with_one.out0, coder=WindowedValueCoder[TupleCoder[StrUtf8Coder, VarIntCoder]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[grpc-default-executor-0] INFO sdk_worker.run - Got work 5
[grpc-default-executor-0] INFO sdk_worker.run - Got work 6
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/InitializeWrite output_tags=['out']>
[jobmanager-future-thread-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at group) (1/1) (attempt #0) to localhost
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at group) (1/1).
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) [DEPLOYING]
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) [DEPLOYING].
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) [DEPLOYING].
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) switched from DEPLOYING to RUNNING.
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/InitializeWrite output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/InitializeWrite.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7).
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) 8aa53bf8e0e369ae1553dcf06266b2c7.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:2/out.0) (1/1) (8aa53bf8e0e369ae1553dcf06266b2c7) switched from RUNNING to FINISHED.
[jobmanager-future-thread-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1).
[grpc-default-executor-0] INFO sdk_worker.run - No more requests from control plane
[grpc-default-executor-0] INFO sdk_worker.run - SDK Harness waiting for in-flight requests to complete
[grpc-default-executor-0] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450).
[grpc-default-executor-1] INFO data_plane.close - Closing all cached grpc data channels.
[grpc-default-executor-1] INFO sdk_worker.close - Closing all cached gRPC state handlers.
[grpc-default-executor-1] INFO sdk_worker.run - Done consuming work.
[grpc-default-executor-1] INFO sdk_worker_main.main - Python sdk harness exiting.
[CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) [FINISHED]
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) 05b8173ddd1a42d764868cdf859c9450.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 60read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: group) -> Map (Key Extractor) (1/1) (05b8173ddd1a42d764868cdf859c9450) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) switched from DEPLOYING to RUNNING.
[jobmanager-future-thread-1] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (attempt #0) to localhost
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1).
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) switched from DEPLOYING to RUNNING.
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607).
[GroupReduce (GroupReduce at group) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at group) 708d3c4115c1ff76b384fa55fbb4a607.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at group) (1/1) (708d3c4115c1ff76b384fa55fbb4a607) switched from RUNNING to FINISHED.
[Finalizer] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[Finalizer] WARN org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory - Error cleaning up environment url: "b37d84d99d38"
java.lang.IllegalStateException: call already closed
at org.apache.beam.vendor.guava.v20.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl.close(ServerCallImpl.java:172)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onCompleted(ServerCalls.java:358)
at org.apache.beam.runners.fnexecution.state.GrpcStateService.close(GrpcStateService.java:54)
at org.apache.beam.runners.fnexecution.GrpcFnServer.close(GrpcFnServer.java:83)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.close(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.lambda$createEnvironmentCache$0(DockerJobBundleFactory.java:163)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1963)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3562)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3538)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3309)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.clear(LocalCache.java:4322)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4937)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.close(DockerJobBundleFactory.java:201)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.finalize(BatchFlinkExecutableStageContext.java:73)
at java.lang.System$2.invokeFinalize(System.java:1270)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:98)
at java.lang.ref.Finalizer.access$100(Finalizer.java:34)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:210)
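
Note: the "call already closed" stack above is a cleanup race, readable directly from the trace: a Finalizer thread tears down a cached SDK-harness environment, GrpcStateService.close() calls onCompleted() on a gRPC server call that was already completed, and Preconditions.checkState raises IllegalStateException. The job keeps running after the warning. Purely as an illustration of the kind of guard the warning points at (a sketch with invented names, not Beam's actual code), a close can be made idempotent so a repeated call is a no-op instead of an exception:

    import threading

    class IdempotentCloser(object):
        # Illustrative sketch only (hypothetical class, not part of Beam):
        # wraps a close function so calling close() twice is harmless.
        def __init__(self, close_fn):
            self._close_fn = close_fn
            self._closed = False
            self._lock = threading.Lock()

        def close(self):
            with self._lock:
                if self._closed:
                    return  # already closed elsewhere; ignore instead of raising
                self._closed = True
            self._close_fn()
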
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image b37d84d99d38
java.io.IOException: Received exit code 1 for command 'docker pull b37d84d99d38'. stderr: Error response from daemon: pull access denied for b37d84d99d38, repository does not exist or may require 'docker login'
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:150)
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:77)
at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:147)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:174)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:170)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4992)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.forStage(DockerJobBundleFactory.java:183)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.getStageBundleFactory(BatchFlinkExecutableStageContext.java:55)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:96)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
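
Note: the pull failure above is expected here. The harness image is given as a bare local image ID (b37d84d99d38) rather than a repository tag, so `docker pull` has nothing to resolve and the daemon reports "pull access denied"; the runner then runs the image that is already present locally, which is why the Python SDK harness still starts in the lines below. A hypothetical pre-flight check (not part of Beam) to confirm the image exists locally before submitting could look like:

    import os
    import subprocess

    def image_exists_locally(image):
        # 'docker image inspect' exits with status 0 only when the image ID or
        # tag is present in the local daemon; output is discarded here.
        with open(os.devnull, 'wb') as devnull:
            return subprocess.call(['docker', 'image', 'inspect', image],
                                   stdout=devnull, stderr=devnull) == 0

    print(image_exists_locally('b37d84d99d38'))
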
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST -> 1 artifacts
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetArtifact name: "pickled_main_session"
retrieval_token: "/tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST"
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Artifact pickled_main_session located in /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-2] INFO sdk_worker_main.main - Logging handler created.
[grpc-default-executor-2] INFO sdk_worker_main.main - semi_persistent_directory: /tmp
[grpc-default-executor-2] INFO sdk_worker_main.start - Status HTTP server running at localhost:35275
[grpc-default-executor-2] INFO sdk_worker_main.main - Python sdk harness started with pipeline_options: {u'beam:option:dry_run:v1': False, u'beam:option:harness_docker_image:v1': u'b37d84d99d38', u'beam:option:pipeline_type_check:v1': True, u'beam:option:job_endpoint:v1': u'localhost:8099', u'beam:option:dataflow_endpoint:v1': u'https://dataflow.googleapis.com', u'beam:option:runner:v1': None, u'beam:option:sdk_location:v1': u'container', u'beam:option:direct_runner_use_stacked_bundle:v1': True, u'beam:option:runtime_type_check:v1': False, u'beam:option:flink_master:v1': u'[auto]', u'beam:option:save_main_session:v1': True, u'beam:option:type_check_strictness:v1': u'DEFAULT_TO_ANY', u'beam:option:region:v1': u'us-central1', u'beam:option:profile_memory:v1': False, u'beam:option:profile_cpu:v1': False, u'beam:option:app_name:v1': None, u'beam:option:options_id:v1': 1, u'beam:option:no_auth:v1': False, u'beam:option:streaming:v1': False, u'beam:option:experiments:v1': [u'beam_fn_api'], u'beam:option:job_name:v1': u'BeamApp-ryan-0726042851-cb36f0c0'}
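
Note: the options dump above corresponds to a submission against the portable Flink job server at localhost:8099 with the beam_fn_api experiment, save_main_session, and a locally built harness image. The exact client command line is not in this log; a rough reconstruction under those assumptions, with flag names mirroring the keys echoed by the harness, would be:

    from apache_beam.options.pipeline_options import PipelineOptions

    # Hedged reconstruction of the submit-side options (not the original invocation).
    options = PipelineOptions([
        '--runner=PortableRunner',               # assumption: how the job reached localhost:8099
        '--job_endpoint=localhost:8099',
        '--experiments=beam_fn_api',
        '--harness_docker_image=b37d84d99d38',   # local image ID, hence the pull warnings above
        '--save_main_session',
    ])
    print(options.get_all_options()['job_endpoint'])
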
[grpc-default-executor-2] INFO sdk_worker.__init__ - Creating insecure control channel.
[grpc-default-executor-2] INFO sdk_worker.__init__ - Control channel established.
[grpc-default-executor-2] INFO sdk_worker.__init__ - Initializing SDKHarness with 12 workers.
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 1
[grpc-default-executor-2] INFO sdk_worker.run - Got work 1
[grpc-default-executor-2] INFO sdk_worker.run - Got work 2
[grpc-default-executor-1] INFO sdk_worker.run - Got work 3
[grpc-default-executor-1] INFO sdk_worker.run - Got work 4
[grpc-default-executor-1] INFO sdk_worker.create_state_handler - Creating channel for host.docker.internal:58278
[grpc-default-executor-1] INFO data_plane.create_data_channel - Creating channel for host.docker.internal:58277
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation format output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/WindowInto(WindowIntoFn) output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/Pair output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation count output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/WriteBundles output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[StrUtf8Coder], IterableCoder[VarIntCoder]]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[StrUtf8Coder], IterableCoder[VarIntCoder]]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation count output_tags=['out'], receivers=[ConsumerSet[count.out0, coder=WindowedValueCoder[TupleCoder[StrUtf8Coder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation format output_tags=['out'], receivers=[ConsumerSet[format.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86).
[CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) [FINISHED]
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) 31fd1d53dd169890af1b7eb111e0dc86.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 19group/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 19group/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (31fd1d53dd169890af1b7eb111e0dc86) switched from RUNNING to FINISHED.
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/WriteBundles output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/WriteBundles.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/Pair output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/Pair.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/WindowInto(WindowIntoFn) output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/WindowInto(WindowIntoFn).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[jobmanager-future-thread-7] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) switched from SCHEDULED to DEPLOYING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (attempt #0) to localhost
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc).
[CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1).
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) ec11ff0fcfdc22f400da4cd16a99bdfc.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) switched from CREATED to DEPLOYING.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) [DEPLOYING]
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) [DEPLOYING].
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) [DEPLOYING].
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 6format.None/b37d84d99d38:0) -> FlatMap (FlatMap at 6format.None/b37d84d99d38:0/out.0) -> Map (Key Extractor) -> GroupCombine (GroupCombine at GroupCombine: write/Write/WriteImpl/GroupByKey) -> Map (Key Extractor) (1/1) (ec11ff0fcfdc22f400da4cd16a99bdfc) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) switched from DEPLOYING to RUNNING.
[grpc-default-executor-1] INFO sdk_worker.run - No more requests from control plane
[grpc-default-executor-1] INFO sdk_worker.run - SDK Harness waiting for in-flight requests to complete
[grpc-default-executor-2] INFO data_plane.close - Closing all cached grpc data channels.
[grpc-default-executor-2] INFO sdk_worker.close - Closing all cached gRPC state handlers.
[grpc-default-executor-1] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[grpc-default-executor-1] INFO sdk_worker.run - Done consuming work.
[grpc-default-executor-1] INFO sdk_worker_main.main - Python sdk harness exiting.
[grpc-default-executor-1] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Logging client hanged up.
[jobmanager-future-thread-8] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) switched from CREATED to SCHEDULED.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) switched from SCHEDULED to DEPLOYING.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) switched from RUNNING to FINISHED.
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab).
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Deploying CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (attempt #0) to localhost
[GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) [FINISHED]
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) 2606b25605378460a235c7788db70cab.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - GroupReduce (GroupReduce at write/Write/WriteImpl/GroupByKey) (1/1) (2606b25605378460a235c7788db70cab) switched from RUNNING to FINISHED.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Received task CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1).
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) switched from CREATED to DEPLOYING.
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Creating FileSystem stream leak safety net for task CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) [DEPLOYING]
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Loading JAR files for task CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Registering task at network: CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) [DEPLOYING].
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) switched from DEPLOYING to RUNNING.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) switched from DEPLOYING to RUNNING.
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) exceeded the 80 characters length limit and was truncated.
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] WARN org.apache.flink.metrics.MetricGroup - The operator name FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) exceeded the 80 characters length limit and was truncated.
[Finalizer] WARN org.apache.beam.sdk.fn.data.BeamFnDataGrpcMultiplexer - Hanged up for unknown endpoint.
[Finalizer] WARN org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory - Error cleaning up environment url: "b37d84d99d38"
java.lang.IllegalStateException: call already closed
at org.apache.beam.vendor.guava.v20.com.google.common.base.Preconditions.checkState(Preconditions.java:444)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl.close(ServerCallImpl.java:172)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onCompleted(ServerCalls.java:358)
at org.apache.beam.runners.fnexecution.state.GrpcStateService.close(GrpcStateService.java:54)
at org.apache.beam.runners.fnexecution.GrpcFnServer.close(GrpcFnServer.java:83)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.$closeResource(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$WrappedSdkHarnessClient.close(DockerJobBundleFactory.java:368)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.lambda$createEnvironmentCache$0(DockerJobBundleFactory.java:163)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.processPendingNotifications(LocalCache.java:1963)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.runUnlockedCleanup(LocalCache.java:3562)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.postWriteCleanup(LocalCache.java:3538)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.clear(LocalCache.java:3309)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.clear(LocalCache.java:4322)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalManualCache.invalidateAll(LocalCache.java:4937)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.close(DockerJobBundleFactory.java:201)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.finalize(BatchFlinkExecutableStageContext.java:73)
at java.lang.System$2.invokeFinalize(System.java:1270)
at java.lang.ref.Finalizer.runFinalizer(Finalizer.java:98)
at java.lang.ref.Finalizer.access$100(Finalizer.java:34)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:210)
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] WARN org.apache.beam.runners.fnexecution.environment.DockerCommand - Unable to pull docker image b37d84d99d38
java.io.IOException: Received exit code 1 for command 'docker pull b37d84d99d38'. stderr: Error response from daemon: pull access denied for b37d84d99d38, repository does not exist or may require 'docker login'
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runShortCommand(DockerCommand.java:150)
at org.apache.beam.runners.fnexecution.environment.DockerCommand.runImage(DockerCommand.java:77)
at org.apache.beam.runners.fnexecution.environment.DockerEnvironmentFactory.createEnvironment(DockerEnvironmentFactory.java:147)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:174)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory$2.load(DockerJobBundleFactory.java:170)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at org.apache.beam.repackaged.beam_runners_java_fn_execution.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4992)
at org.apache.beam.runners.fnexecution.control.DockerJobBundleFactory.forStage(DockerJobBundleFactory.java:183)
at org.apache.beam.runners.flink.translation.functions.BatchFlinkExecutableStageContext.getStageBundleFactory(BatchFlinkExecutableStageContext.java:55)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.open(FlinkExecutableStageFunction.java:96)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:494)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetManifest for /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST -> 1 artifacts
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - GetArtifact name: "pickled_main_session"
retrieval_token: "/tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/MANIFEST"
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactRetrievalService - Artifact pickled_main_session located in /tmp/flink-artifacts/job_aec90381-b818-4f85-94cd-9c44e0a1b3ef/artifacts/artifact_ea0d10d07f4601782ed647e8f6ba4a055be13674ab79fa0c6e2fa44917c5264c
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.logging.GrpcLoggingService - Beam Fn Logging client connected.
[grpc-default-executor-2] INFO sdk_worker_main.main - Logging handler created.
[grpc-default-executor-2] INFO sdk_worker_main.main - semi_persistent_directory: /tmp
[grpc-default-executor-2] INFO sdk_worker_main.start - Status HTTP server running at localhost:37529
[grpc-default-executor-2] INFO sdk_worker_main.main - Python sdk harness started with pipeline_options: {u'beam:option:dry_run:v1': False, u'beam:option:harness_docker_image:v1': u'b37d84d99d38', u'beam:option:pipeline_type_check:v1': True, u'beam:option:job_endpoint:v1': u'localhost:8099', u'beam:option:dataflow_endpoint:v1': u'https://dataflow.googleapis.com', u'beam:option:runner:v1': None, u'beam:option:sdk_location:v1': u'container', u'beam:option:direct_runner_use_stacked_bundle:v1': True, u'beam:option:runtime_type_check:v1': False, u'beam:option:flink_master:v1': u'[auto]', u'beam:option:save_main_session:v1': True, u'beam:option:type_check_strictness:v1': u'DEFAULT_TO_ANY', u'beam:option:region:v1': u'us-central1', u'beam:option:profile_memory:v1': False, u'beam:option:profile_cpu:v1': False, u'beam:option:app_name:v1': None, u'beam:option:options_id:v1': 1, u'beam:option:no_auth:v1': False, u'beam:option:streaming:v1': False, u'beam:option:experiments:v1': [u'beam_fn_api'], u'beam:option:job_name:v1': u'BeamApp-ryan-0726042851-cb36f0c0'}
[grpc-default-executor-2] INFO sdk_worker.__init__ - Creating insecure control channel.
[grpc-default-executor-2] INFO sdk_worker.__init__ - Control channel established.
[grpc-default-executor-2] INFO sdk_worker.__init__ - Initializing SDKHarness with 12 workers.
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService - Beam Fn Control client connected with id 1
[grpc-default-executor-2] INFO sdk_worker.run - Got work 1
[grpc-default-executor-2] INFO sdk_worker.run - Got work 2
[grpc-default-executor-2] INFO sdk_worker.create_state_handler - Creating channel for host.docker.internal:58299
[grpc-default-executor-2] INFO data_plane.create_data_channel - Creating channel for host.docker.internal:58298
[grpc-default-executor-2] INFO org.apache.beam.runners.fnexecution.data.GrpcDataService - Beam Fn Data client connected.
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/Extract output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/Extract output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/Extract.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770).
[CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) [FINISHED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) 700b330132df4fba480f0ceb1f920770.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0) -> FlatMap (FlatMap at 46write/Write/WriteImpl/GroupByKey/GroupByWindow.None/b37d84d99d38:0/out.0) (1/1) (700b330132df4fba480f0ceb1f920770) switched from RUNNING to FINISHED.
[grpc-default-executor-2] INFO sdk_worker.run - Got work 3
[grpc-default-executor-2] INFO sdk_worker.run - Got work 4
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataOutputOperation >
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/PreFinalize output_tags=['out']>
[grpc-default-executor-2] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DoOperation write/Write/WriteImpl/PreFinalize output_tags=['out'], receivers=[ConsumerSet[write/Write/WriteImpl/PreFinalize.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - finish <DataOutputOperation >
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) switched from RUNNING to FINISHED.
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675).
[CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) [FINISHED]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FINISHED to JobManager for task CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) 0a6aba573dea3237e38d94ad9b5aa675.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - CHAIN MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1) -> FlatMap (FlatMap at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:1/out.0) (1/1) (0a6aba573dea3237e38d94ad9b5aa675) switched from RUNNING to FINISHED.
[grpc-default-executor-1] INFO sdk_worker.run - Got work 5
[grpc-default-executor-1] INFO sdk_worker.run - Got work 6
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DoOperation write/Write/WriteImpl/FinalizeWrite output_tags=['out']>
[grpc-default-executor-1] INFO bundle_processor.process_bundle - start <DataInputOperation receivers=[ConsumerSet[.out0, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
[grpc-default-executor-2] ERROR sdk_worker._execute - Error processing instruction 6. Original traceback is
Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
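
Note: this BeamIOError is the actual job failure. write/Write/WriteImpl/FinalizeWrite tries to move /tmp/beam-temp-py-wordcount-direct-.../....py-wordcount-direct to /tmp/py-wordcount-direct-00000-of-00001, but neither path exists in the filesystem visible to the harness running that step. Each stage here executes in its own Docker container (the environments are repeatedly created and finalized above), so a file written under a container-local /tmp by one harness is most likely not visible to the next one. A hedged workaround sketch, not the original invocation: the same wordcount shape as the step labels in this log (split/group/count/format/write), but writing to a prefix every harness container can reach ('/mnt/shared' is a hypothetical mounted volume; a gs:// or hdfs:// prefix would serve the same purpose), assuming the job server from this log is still listening on localhost:8099:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=PortableRunner',
        '--job_endpoint=localhost:8099',
        '--experiments=beam_fn_api',
    ])
    with beam.Pipeline(options=options) as p:
        (p
         | 'read' >> beam.Create(['to be or not to be'])
         | 'split' >> beam.FlatMap(lambda line: line.split())
         | 'pair_with_one' >> beam.Map(lambda w: (w, 1))
         | 'group' >> beam.GroupByKey()
         | 'count' >> beam.Map(lambda kv: (kv[0], sum(kv[1])))
         | 'format' >> beam.Map(lambda kv: '%s: %d' % kv)
         | 'write' >> beam.io.WriteToText('/mnt/shared/py-wordcount'))  # shared path, not container-local /tmp
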
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] ERROR org.apache.flink.runtime.operators.BatchTask - Error in task code: MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 6: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) switched from RUNNING to FAILED.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 6: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
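The surrounding Java stack shows how that Python exception reaches the Flink task: the SDK harness reports the bundle failure over the Fn API control channel, FnApiControlClient fails the bundle's future, and SdkHarnessClient$ActiveBundle.close() rethrows it inside FlinkExecutableStageFunction.mapPartition, which is why the same traceback is repeated by each Flink log event below as the task and then the job transition to FAILED. A rough sketch of the harness-side wrapping (an illustration, not the actual sdk_worker.py source) follows; any failure in the instruction handler is captured as a traceback string and returned to the runner in place of a normal response:

# Rough illustration of how the SDK harness turns a worker-side exception into
# the "Error received from SDK harness for instruction 6: ..." message seen on
# the Java side. Not the real sdk_worker.py code; field names are simplified
# (the real response is a Fn API InstructionResponse proto).
import traceback

def execute_instruction(task, instruction_id):
    try:
        return {'instruction_id': instruction_id, 'response': task()}
    except Exception:
        # The full Python traceback becomes the error payload the runner logs.
        return {'instruction_id': instruction_id,
                'error': traceback.format_exc()}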
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Freeing task resources for MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed).
[MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1)] INFO org.apache.flink.runtime.taskmanager.Task - Ensuring all FileSystem streams are closed for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) [FAILED]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Un-registering task and sending final execution state FAILED to JobManager for task MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) 185456331d0f7dda03861ee300beb4ed.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - MapPartition (MapPartition at 44write/Write/WriteImpl/DoOnce/Read/ReadSplits.None/b37d84d99d38:0) (1/1) (185456331d0f7dda03861ee300beb4ed) switched from RUNNING to FAILED.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: [identical BeamIOError traceback and Java stack as in the first task failure above; omitted here]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d) switched from state RUNNING to FAILING.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: [identical BeamIOError traceback and Java stack as in the first task failure above; omitted here]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 7b7df7e953db2fdd24f1f17880c93209.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 694ea8bb3735c96cc9b30dbb931a8602.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 4c995c29bff1bde56929b01e36a02348.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution edba53a6ec05da0c4967301743cfb4c0.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 5ac4c56d95ef7ebbd120f5a2320eb5c3.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 8aa53bf8e0e369ae1553dcf06266b2c7.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 185d814b196ed19e3c6c77928e468bec.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution c1724171f667445b88042ad0e528bec9.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 05b8173ddd1a42d764868cdf859c9450.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 708d3c4115c1ff76b384fa55fbb4a607.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 31fd1d53dd169890af1b7eb111e0dc86.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution ec11ff0fcfdc22f400da4cd16a99bdfc.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - DataSink (org.apache.flink.api.java.io.DiscardingOutputFormat@7c0d2acc) (1/1) (d3e7e981c55a00d44de5ceec5691db75) switched from CREATED to CANCELED.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 2606b25605378460a235c7788db70cab.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 700b330132df4fba480f0ceb1f920770.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 0a6aba573dea3237e38d94ad9b5aa675.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Discarding the results produced by task execution 185456331d0f7dda03861ee300beb4ed.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Try to restart or fail the job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d) if no longer possible.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d) switched from state FAILING to FAILED.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: [identical BeamIOError traceback and Java stack as in the first task failure above; omitted here]
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.executiongraph.ExecutionGraph - Could not restart the job BeamApp-ryan-0726042851-cb36f0c0 (d865612786428082b713c0c3ddb6338d) because the restart strategy prevented it.
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: [identical BeamIOError traceback and Java stack as in the first task failure above; omitted here]
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Job d865612786428082b713c0c3ddb6338d reached globally terminal state FAILED.
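Once the job reaches a terminal FAILED state the MiniCluster tears itself down below. For this particular failure, one quick check of the suspected cause (the harness writing to a /tmp the finalize step cannot see) is to look on the host for the temp shards and final shards named in the error. The snippet below is only a diagnostic sketch using the paths from this log, not part of Beam:

# Diagnostic sketch (not part of Beam): list what actually exists on the host
# for the paths named in the BeamIOError above. An empty 'temp_files' list
# together with no 'final_shards' suggests the SDK harness wrote its output
# inside its own container filesystem rather than the host's /tmp.
import glob
import os

def find_wordcount_artifacts(output_prefix='/tmp/py-wordcount-direct'):
    temp_dirs = glob.glob('/tmp/beam-temp-py-wordcount-direct-*')
    return {
        'temp_dirs': temp_dirs,
        'temp_files': [f for d in temp_dirs
                       for f in glob.glob(os.path.join(d, '*'))],
        'final_shards': glob.glob(output_prefix + '-*-of-*'),
    }

if __name__ == '__main__':
    print(find_wordcount_artifacts())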
[flink-runner-job-server] INFO org.apache.flink.runtime.minicluster.MiniCluster - Shutting down Flink Mini Cluster
[flink-runner-job-server] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shutting down rest endpoint.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopping TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Stopping the JobMaster for job BeamApp-ryan-0726042851-cb36f0c0(d865612786428082b713c0c3ddb6338d).
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.state.TaskExecutorLocalStateStoresManager - Shutting down TaskExecutorLocalStateStoresManager.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.jobmaster.JobMaster - Close ResourceManager connection 3f1c8f8f26e120d3bc31af149b779c5f: JobManager is shutting down..
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Suspending SlotPool.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.jobmaster.slotpool.SlotPool - Stopping SlotPool.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher327263a8-9047-41de-b5ea-a1ebcda651b7.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher327263a8-9047-41de-b5ea-a1ebcda651b7.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - JobManager for job d865612786428082b713c0c3ddb6338d with leader id 9a43b4bdc5c1fc6677f79eafd5e6463f lost leadership.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Disconnect job manager 9a43b4bdc5c1fc6677f79eafd5e6463f@akka://flink/user/jobmanager_1 for job d865612786428082b713c0c3ddb6338d from the resource manager.
[ForkJoinPool.commonPool-worker-4] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-web-ui
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher327263a8-9047-41de-b5ea-a1ebcda651b7.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-2] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Unregister TaskManager 34838cf13f97a5e7037230e6b59e574f from the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.io.disk.iomanager.IOManager - I/O manager removed spill file directory /var/folders/m0/mj2x82p1527349z6mn8btgtr0000gr/T/flink-io-15e5a681-f1fa-41e5-a008-1be13cbdd91e
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:58215
[flink-akka.actor.default-dispatcher-3] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
[flink-runner-job-server] ERROR org.apache.beam.runners.flink.FlinkJobInvocation - Error during job invocation BeamApp-ryan-0726042851-cb36f0c0_53d0009c-40d9-4611-89b2-528ab81eb15c.
org.apache.flink.runtime.client.JobExecutionException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:625)
at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:234)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
at org.apache.beam.runners.flink.FlinkJobInvocation.runPipeline(FlinkJobInvocation.java:116)
at org.apache.beam.repackaged.beam_runners_flink_2.11.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:111)
at org.apache.beam.repackaged.beam_runners_flink_2.11.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:58)
at org.apache.beam.repackaged.beam_runners_flink_2.11.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:75)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 6: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:246)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:119)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:703)
... 1 more
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 6: Traceback (most recent call last):
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 134, in _execute
response = task()
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 169, in <lambda>
self._execute(lambda: worker.do_instruction(work), work)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 215, in do_instruction
request.instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/sdk_worker.py", line 237, in process_bundle
processor.process_bundle(instruction_id)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 299, in process_bundle
input_op.process_encoded(data.data)
File "/usr/local/lib/python2.7/site-packages/apache_beam/runners/worker/bundle_processor.py", line 120, in process_encoded
self.output(decoded_value)
File "apache_beam/runners/worker/operations.py", line 170, in apache_beam.runners.worker.operations.Operation.output
def output(self, windowed_value, output_index=0):
File "apache_beam/runners/worker/operations.py", line 171, in apache_beam.runners.worker.operations.Operation.output
cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
File "apache_beam/runners/worker/operations.py", line 88, in apache_beam.runners.worker.operations.ConsumerSet.receive
cython.cast(Operation, consumer).process(windowed_value)
File "apache_beam/runners/worker/operations.py", line 391, in apache_beam.runners.worker.operations.DoOperation.process
with self.scoped_process_state:
File "apache_beam/runners/worker/operations.py", line 392, in apache_beam.runners.worker.operations.DoOperation.process
self.dofn_receiver.receive(o)
File "apache_beam/runners/common.py", line 591, in apache_beam.runners.common.DoFnRunner.receive
self.process(windowed_value)
File "apache_beam/runners/common.py", line 597, in apache_beam.runners.common.DoFnRunner.process
self._reraise_augmented(exn)
File "apache_beam/runners/common.py", line 630, in apache_beam.runners.common.DoFnRunner._reraise_augmented
raise_(type(new_exn), new_exn, original_traceback)
File "apache_beam/runners/common.py", line 595, in apache_beam.runners.common.DoFnRunner.process
self.do_fn_invoker.invoke_process(windowed_value)
File "apache_beam/runners/common.py", line 474, in apache_beam.runners.common.PerWindowInvoker.invoke_process
self._invoke_per_window(
File "apache_beam/runners/common.py", line 524, in apache_beam.runners.common.PerWindowInvoker._invoke_per_window
output_processor.process_outputs(
File "apache_beam/runners/common.py", line 661, in apache_beam.runners.common._OutputProcessor.process_outputs
def process_outputs(self, windowed_input_element, results):
File "apache_beam/runners/common.py", line 676, in apache_beam.runners.common._OutputProcessor.process_outputs
for result in results:
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/iobase.py", line 1074, in <genexpr>
return (window.TimestampedValue(v, window.MAX_TIMESTAMP) for v in outputs)
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 271, in finalize_write
self._check_state_for_finalize_write(writer_results, num_shards))
File "/usr/local/lib/python2.7/site-packages/apache_beam/io/filebasedsink.py", line 249, in _check_state_for_finalize_write
src, dst))
BeamIOError: src and dst files do not exist. src: /tmp/beam-temp-py-wordcount-direct-6a0d8862908c11e88de8025000000001/5cfa9f22-9246-41fb-adef-ca04d5a5fe50.py-wordcount-direct, dst: /tmp/py-wordcount-direct-00000-of-00001 with exceptions None [while running 'write/Write/WriteImpl/FinalizeWrite'] with exceptions None
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
at org.apache.beam.vendor.grpc.v1.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:248)
at org.apache.beam.vendor.grpc.v1.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
at org.apache.beam.vendor.grpc.v1.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:263)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:683)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
at org.apache.beam.vendor.grpc.v1.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
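
--- Note (not part of the original log) ---

The run above fails in the 'write/Write/WriteImpl/FinalizeWrite' step with a BeamIOError: neither the temp shard (/tmp/beam-temp-py-wordcount-direct-.../....py-wordcount-direct) nor the final output (/tmp/py-wordcount-direct-00000-of-00001) exists when the sink tries to finalize. The sketch below is NOT Beam's implementation; it is a minimal illustration of the existence check that filebasedsink._check_state_for_finalize_write appears to be making before renaming temp shards into place. The helper name check_rename_pairs is made up for this note.

    # Illustrative sketch only -- not Apache Beam code.
    # A missing src is tolerated only if dst already exists (i.e. the rename
    # already happened); if both src and dst are missing, finalization fails,
    # which matches the "src and dst files do not exist" error in the log.
    import os

    def check_rename_pairs(pairs):
        """pairs: iterable of (temp_shard_path, final_output_path)."""
        to_rename, missing = [], []
        for src, dst in pairs:
            if os.path.exists(src):
                to_rename.append((src, dst))   # normal case: rename later
            elif os.path.exists(dst):
                continue                       # already finalized; skip
            else:
                missing.append((src, dst))     # neither exists -> error
        if missing:
            raise IOError('src and dst files do not exist: %s' % missing)
        return to_rename

A likely explanation, assuming the Python SDK harness runs in its own (e.g. Docker) container: the temp shards are written to /tmp inside one harness, while FinalizeWrite runs elsewhere and sees a different /tmp, so both paths look absent. Pointing the pipeline's --output at a location visible to every harness (a mounted shared directory or a distributed filesystem) is the usual way around this; the exact flags and setup depend on how the job server and harness containers are launched.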