@ramcoelho
Last active December 15, 2020 23:39
$ docker service logs dg-stack_boot-spark-processor
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |   .   ____          _            __ _ _
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |  /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |  \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |   '  |____| .__|_| |_|_| |_\__, | / / / /
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |  =========|_|==============|___/=/_/_/_/
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |  :: Spring Boot ::        (v2.2.5.RELEASE)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 |
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:29.229 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : Starting DangerousGoodProcessor v0.0.1-SNAPSHOT on 5c1f7f9d3fca with PID 1 (/app/classpath/dangerous-good-processor.original.jar started by root in /)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |   .   ____          _            __ _ _
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |  /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |  \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |   '  |____| .__|_| |_|_| |_\__, | / / / /
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |  =========|_|==============|___/=/_/_/_/
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |  :: Spring Boot ::        (v2.2.5.RELEASE)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:51.670 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : Starting DangerousGoodProcessor v0.0.1-SNAPSHOT on 25f0969edcf9 with PID 1 (/app/classpath/dangerous-good-processor.original.jar started by root in /)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:51.672 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : No active profile set, falling back to default profiles: default
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:52.373 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Reactive MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:52.685 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 307ms. Found 0 Reactive MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:52.692 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:52.714 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 22ms. Found 1 MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | WARNING: An illegal reflective access operation has occurred
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/app/libs/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | WARNING: All illegal access operations will be denied in a future release
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:53.864 INFO 1 --- [ main] org.apache.spark.SparkContext : Running Spark version 2.4.5
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:54.240 WARN 1 --- [ main] org.apache.hadoop.util.NativeCodeLoader : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:54.623 INFO 1 --- [ main] org.apache.spark.SparkContext : Submitted application: dangerous-good-realtime-processor
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:54.681 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls to: root
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:54.682 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls to: root
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:54.682 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls groups to:
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:54.682 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls groups to:
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:54.682 INFO 1 --- [ main] org.apache.spark.SecurityManager : SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.172 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'sparkDriver' on port 36223.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.297 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering MapOutputTracker
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.318 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering BlockManagerMaster
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.321 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.321 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : BlockManagerMasterEndpoint up
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.350 INFO 1 --- [ main] o.apache.spark.storage.DiskBlockManager : Created local directory at /tmp/blockmgr-6007a6a3-f2a9-4a8f-a892-35410ac96c64
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.371 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : MemoryStore started with capacity 9.2 GB
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.395 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering OutputCommitCoordinator
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.479 INFO 1 --- [ main] org.spark_project.jetty.util.log : Logging initialized @5767ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.532 INFO 1 --- [ main] org.spark_project.jetty.server.Server : jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.613 INFO 1 --- [ main] org.spark_project.jetty.server.Server : Started @5901ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.633 INFO 1 --- [ main] o.s.jetty.server.AbstractConnector : Started ServerConnector@472c9f88{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.633 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'SparkUI' on port 4040.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.706 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@be6d228{/jobs,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.707 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7bb4ed71{/jobs/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.707 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5112b7{/jobs/job,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.708 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1c72189f{/jobs/job/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.709 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@14be750c{/stages,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.710 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@58882a93{/stages/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.710 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@15e08615{/stages/stage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.712 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@39ace1a7{/stages/stage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.713 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1930a804{/stages/pool,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.713 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@bd4ee01{/stages/pool/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.713 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7f93f4c{/storage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.714 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@598657cd{/storage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.714 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@456aa471{/storage/rdd,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.715 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@69e2fe3b{/storage/rdd/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.715 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@27afbf14{/environment,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.716 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4cfcac13{/environment/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.721 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5c25d0d1{/executors,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.723 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@22c8ee48{/executors/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.725 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7845b21a{/executors/threadDump,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.726 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@51f01535{/executors/threadDump/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.733 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2321e482{/static,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.734 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3ed7dd70{/,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.734 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6f3b13d0{/api,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.735 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@59b447a4{/jobs/job/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.740 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@69ba3f4e{/stages/stage/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.748 INFO 1 --- [ main] org.apache.spark.ui.SparkUI : Bound SparkUI to 0.0.0.0, and started at http://25f0969edcf9:4040
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-databind-2.10.2.jar at spark://25f0969edcf9:36223/jars/jackson-databind-2.10.2.jar with timestamp 1608051775770
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-api-2.3.1.jar at spark://25f0969edcf9:36223/jars/connect-api-2.3.1.jar with timestamp 1608051775770
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-logging-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-boot-starter-logging-2.2.5.RELEASE.jar with timestamp 1608051775770
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jetty-util-6.1.26.jar at spark://25f0969edcf9:36223/jars/jetty-util-6.1.26.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-asl-1.9.13.jar at spark://25f0969edcf9:36223/jars/jackson-core-asl-1.9.13.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-util-1.0.0-M20.jar at spark://25f0969edcf9:36223/jars/api-util-1.0.0-M20.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-parser-combinators_2.11-1.1.0.jar at spark://25f0969edcf9:36223/jars/scala-parser-combinators_2.11-1.1.0.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-streams-2.3.1.jar at spark://25f0969edcf9:36223/jars/kafka-streams-2.3.1.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/RoaringBitmap-0.7.45.jar at spark://25f0969edcf9:36223/jars/RoaringBitmap-0.7.45.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-api-2.12.1.jar at spark://25f0969edcf9:36223/jars/log4j-api-2.12.1.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.activation-api-1.2.0.jar at spark://25f0969edcf9:36223/jars/javax.activation-api-1.2.0.jar with timestamp 1608051775771
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-core-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-mapreduce-client-core-2.6.5.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.annotation-api-1.3.5.jar at spark://25f0969edcf9:36223/jars/jakarta.annotation-api-1.3.5.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jaxb-api-2.3.1.jar at spark://25f0969edcf9:36223/jars/jaxb-api-2.3.1.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-graphite-4.1.3.jar at spark://25f0969edcf9:36223/jars/metrics-graphite-4.1.3.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-format-0.10.0.jar at spark://25f0969edcf9:36223/jars/arrow-format-0.10.0.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-core-1.2.3.jar at spark://25f0969edcf9:36223/jars/logback-core-1.2.3.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-io-common-1.16.0.jar at spark://25f0969edcf9:36223/jars/jts-io-common-1.16.0.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-common-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-yarn-common-2.6.5.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-autoconfigure-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-boot-autoconfigure-2.2.5.RELEASE.jar with timestamp 1608051775772
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-datatype-jdk8-2.10.2.jar at spark://25f0969edcf9:36223/jars/jackson-datatype-jdk8-2.10.2.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/joda-time-2.9.9.jar at spark://25f0969edcf9:36223/jars/joda-time-2.9.9.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/stream-2.7.0.jar at spark://25f0969edcf9:36223/jars/stream-2.7.0.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xmlenc-0.52.jar at spark://25f0969edcf9:36223/jars/xmlenc-0.52.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-math3-3.4.1.jar at spark://25f0969edcf9:36223/jars/commons-math3-3.4.1.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-mapreduce-1.5.5-nohive.jar at spark://25f0969edcf9:36223/jars/orc-mapreduce-1.5.5-nohive.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-mongodb-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-data-mongodb-2.2.5.RELEASE.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-spark-connector_2.11-2.4.1.jar at spark://25f0969edcf9:36223/jars/mongo-spark-connector_2.11-2.4.1.jar with timestamp 1608051775773
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-to-slf4j-2.12.1.jar at spark://25f0969edcf9:36223/jars/log4j-to-slf4j-2.12.1.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-configuration-processor-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-boot-configuration-processor-2.2.5.RELEASE.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/j2objc-annotations-1.3.jar at spark://25f0969edcf9:36223/jars/j2objc-annotations-1.3.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-media-jaxb-2.29.1.jar at spark://25f0969edcf9:36223/jars/jersey-media-jaxb-2.29.1.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-ipc-1.8.2.jar at spark://25f0969edcf9:36223/jars/avro-ipc-1.8.2.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-common_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-network-common_2.11-2.4.5.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-common-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-common-2.6.5.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-reactivestreams-1.12.0.jar at spark://25f0969edcf9:36223/jars/mongodb-driver-reactivestreams-1.12.0.jar with timestamp 1608051775774
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-core-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-core-5.2.4.RELEASE.jar with timestamp 1608051775775
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jcl-over-slf4j-1.7.30.jar at spark://25f0969edcf9:36223/jars/jcl-over-slf4j-1.7.30.jar with timestamp 1608051775775
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-jobclient-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar with timestamp 1608051775775
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zookeeper-3.4.6.jar at spark://25f0969edcf9:36223/jars/zookeeper-3.4.6.jar with timestamp 1608051775775
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compiler-3.0.8.jar at spark://25f0969edcf9:36223/jars/commons-compiler-3.0.8.jar with timestamp 1608051775775
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-reflect-2.11.12.jar at spark://25f0969edcf9:36223/jars/scala-reflect-2.11.12.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-format-2.4.0.jar at spark://25f0969edcf9:36223/jars/parquet-format-2.4.0.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compress-1.8.1.jar at spark://25f0969edcf9:36223/jars/commons-compress-1.8.1.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/slf4j-api-1.7.30.jar at spark://25f0969edcf9:36223/jars/slf4j-api-1.7.30.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-i18n-2.0.0-M15.jar at spark://25f0969edcf9:36223/jars/apacheds-i18n-2.0.0-M15.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-commons-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-data-commons-2.2.5.RELEASE.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill_2.11-0.9.3.jar at spark://25f0969edcf9:36223/jars/chill_2.11-0.9.3.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-classic-1.2.3.jar at spark://25f0969edcf9:36223/jars/logback-classic-1.2.3.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/antlr4-runtime-4.7.jar at spark://25f0969edcf9:36223/jars/antlr4-runtime-4.7.jar with timestamp 1608051775778
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-client-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-yarn-client-2.6.5.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hppc-0.7.2.jar at spark://25f0969edcf9:36223/jars/hppc-0.7.2.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/checker-compat-qual-2.5.5.jar at spark://25f0969edcf9:36223/jars/checker-compat-qual-2.5.5.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-boot-2.2.5.RELEASE.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-io-2.4.jar at spark://25f0969edcf9:36223/jars/commons-io-2.4.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-2.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-boot-starter-2.2.5.RELEASE.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-2.10.2.jar at spark://25f0969edcf9:36223/jars/jackson-core-2.10.2.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-jaxrs-1.9.13.jar at spark://25f0969edcf9:36223/jars/jackson-jaxrs-1.9.13.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-aop-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-aop-5.2.4.RELEASE.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/compress-lzf-1.0.3.jar at spark://25f0969edcf9:36223/jars/compress-lzf-1.0.3.jar with timestamp 1608051775779
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-kvstore_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-kvstore_2.11-2.4.5.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-mapred-1.8.2-hadoop2.jar at spark://25f0969edcf9:36223/jars/avro-mapred-1.8.2-hadoop2.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpcore-4.4.13.jar at spark://25f0969edcf9:36223/jars/httpcore-4.4.13.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-paranamer-2.10.2.jar at spark://25f0969edcf9:36223/jars/jackson-module-paranamer-2.10.2.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-crypto-1.0.0.jar at spark://25f0969edcf9:36223/jars/commons-crypto-1.0.0.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-configuration-1.6.jar at spark://25f0969edcf9:36223/jars/commons-configuration-1.6.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-annotations-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-annotations-2.6.5.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-unsafe_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-unsafe_2.11-2.4.5.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-httpclient-3.1.jar at spark://25f0969edcf9:36223/jars/commons-httpclient-3.1.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/flatbuffers-1.2.0-3f79e055.jar at spark://25f0969edcf9:36223/jars/flatbuffers-1.2.0-3f79e055.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar at spark://25f0969edcf9:36223/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar with timestamp 1608051775780
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-jackson-1.10.1.jar at spark://25f0969edcf9:36223/jars/parquet-jackson-1.10.1.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-xc-1.9.13.jar at spark://25f0969edcf9:36223/jars/jackson-xc-1.9.13.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming-kafka-0-10_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-streaming-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/ivy-2.4.0.jar at spark://25f0969edcf9:36223/jars/ivy-2.4.0.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snakeyaml-1.25.jar at spark://25f0969edcf9:36223/jars/snakeyaml-1.25.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/htrace-core-3.0.4.jar at spark://25f0969edcf9:36223/jars/htrace-core-3.0.4.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-annotations-2.10.2.jar at spark://25f0969edcf9:36223/jars/jackson-annotations-2.10.2.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-server-common-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-yarn-server-common-2.6.5.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-recipes-4.0.1.jar at spark://25f0969edcf9:36223/jars/curator-recipes-4.0.1.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-asn1-api-1.0.0-M20.jar at spark://25f0969edcf9:36223/jars/api-asn1-api-1.0.0-M20.jar with timestamp 1608051775781
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/activation-1.1.1.jar at spark://25f0969edcf9:36223/jars/activation-1.1.1.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/objenesis-2.6.jar at spark://25f0969edcf9:36223/jars/objenesis-2.6.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-hadoop-1.10.1.jar at spark://25f0969edcf9:36223/jars/parquet-hadoop-1.10.1.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-1.2.17.jar at spark://25f0969edcf9:36223/jars/log4j-1.2.17.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/leveldbjni-all-1.8.jar at spark://25f0969edcf9:36223/jars/leveldbjni-all-1.8.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/guava-28.2-android.jar at spark://25f0969edcf9:36223/jars/guava-28.2-android.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-streaming_2.11-2.4.5.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-core_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-core_2.11-2.4.5.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-cli-1.2.jar at spark://25f0969edcf9:36223/jars/commons-cli-1.2.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-shuffle_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-network-shuffle_2.11-2.4.5.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-digester-1.8.jar at spark://25f0969edcf9:36223/jars/commons-digester-1.8.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/univocity-parsers-2.7.3.jar at spark://25f0969edcf9:36223/jars/univocity-parsers-2.7.3.jar with timestamp 1608051775782
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-xml_2.11-1.0.6.jar at spark://25f0969edcf9:36223/jars/scala-xml_2.11-1.0.6.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xml-apis-1.3.04.jar at spark://25f0969edcf9:36223/jars/xml-apis-1.3.04.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-jcl-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-jcl-5.2.4.RELEASE.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-jvm-4.1.3.jar at spark://25f0969edcf9:36223/jars/metrics-jvm-4.1.3.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/oro-2.0.8.jar at spark://25f0969edcf9:36223/jars/oro-2.0.8.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/failureaccess-1.0.1.jar at spark://25f0969edcf9:36223/jars/failureaccess-1.0.1.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sketch_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-sketch_2.11-2.4.5.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpclient-4.5.11.jar at spark://25f0969edcf9:36223/jars/httpclient-4.5.11.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-column-1.10.1.jar at spark://25f0969edcf9:36223/jars/parquet-column-1.10.1.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.ws.rs-api-2.1.6.jar at spark://25f0969edcf9:36223/jars/jakarta.ws.rs-api-2.1.6.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-client-2.6.0.jar at spark://25f0969edcf9:36223/jars/curator-client-2.6.0.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/minlog-1.3.0.jar at spark://25f0969edcf9:36223/jars/minlog-1.3.0.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.783 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.inject-2.6.1.jar at spark://25f0969edcf9:36223/jars/jakarta.inject-2.6.1.jar with timestamp 1608051775783
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-core-1.16.0.jar at spark://25f0969edcf9:36223/jars/jts-core-1.16.0.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/unused-1.0.0.jar at spark://25f0969edcf9:36223/jars/unused-1.0.0.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snappy-java-1.1.7.3.jar at spark://25f0969edcf9:36223/jars/snappy-java-1.1.7.3.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-encoding-1.10.1.jar at spark://25f0969edcf9:36223/jars/parquet-encoding-1.10.1.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-api-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-yarn-api-2.6.5.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-shuffle-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-clients-2.3.1.jar at spark://25f0969edcf9:36223/jars/kafka-clients-2.3.1.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-1.8.2.jar at spark://25f0969edcf9:36223/jars/avro-1.8.2.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jul-to-slf4j-1.7.30.jar at spark://25f0969edcf9:36223/jars/jul-to-slf4j-1.7.30.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/gson-2.8.6.jar at spark://25f0969edcf9:36223/jars/gson-2.8.6.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-scala_2.11-2.10.2.jar at spark://25f0969edcf9:36223/jars/jackson-module-scala_2.11-2.10.2.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-net-3.1.jar at spark://25f0969edcf9:36223/jars/commons-net-3.1.jar with timestamp 1608051775784
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.785 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-json-2.3.1.jar at spark://25f0969edcf9:36223/jars/connect-json-2.3.1.jar with timestamp 1608051775785
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.785 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/protobuf-java-3.11.4.jar at spark://25f0969edcf9:36223/jars/protobuf-java-3.11.4.jar with timestamp 1608051775785
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.785 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-retry-1.2.5.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-retry-1.2.5.RELEASE.jar with timestamp 1608051775785
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.785 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactor-core-3.3.3.RELEASE.jar at spark://25f0969edcf9:36223/jars/reactor-core-3.3.3.RELEASE.jar with timestamp 1608051775785
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.785 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-3.11.2.jar at spark://25f0969edcf9:36223/jars/mongodb-driver-3.11.2.jar with timestamp 1608051775785
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill-java-0.9.3.jar at spark://25f0969edcf9:36223/jars/chill-java-0.9.3.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json-simple-1.1.1.jar at spark://25f0969edcf9:36223/jars/json-simple-1.1.1.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-expression-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-expression-5.2.4.RELEASE.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-async-3.11.2.jar at spark://25f0969edcf9:36223/jars/mongodb-driver-async-3.11.2.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.servlet-api-4.0.1.jar at spark://25f0969edcf9:36223/jars/javax.servlet-api-4.0.1.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-client-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-client-2.6.5.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-common-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-mapreduce-client-common-2.6.5.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-memory-0.10.0.jar at spark://25f0969edcf9:36223/jars/arrow-memory-0.10.0.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-mapper-asl-1.9.13.jar at spark://25f0969edcf9:36223/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-hdfs-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-hdfs-2.6.5.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.786 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/py4j-0.10.7.jar at spark://25f0969edcf9:36223/jars/py4j-0.10.7.jar with timestamp 1608051775786
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang3-3.9.jar at spark://25f0969edcf9:36223/jars/commons-lang3-3.9.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql-kafka-0-10_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-sql-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-messaging-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-messaging-5.2.4.RELEASE.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lz4-java-1.4.0.jar at spark://25f0969edcf9:36223/jars/lz4-java-1.4.0.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-scalap_2.11-3.5.3.jar at spark://25f0969edcf9:36223/jars/json4s-scalap_2.11-3.5.3.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-kerberos-codec-2.0.0-M15.jar at spark://25f0969edcf9:36223/jars/apacheds-kerberos-codec-2.0.0-M15.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang-2.6.jar at spark://25f0969edcf9:36223/jars/commons-lang-2.6.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/pyrolite-4.13.jar at spark://25f0969edcf9:36223/jars/pyrolite-4.13.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-catalyst_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-catalyst_2.11-2.4.5.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/aircompressor-0.10.jar at spark://25f0969edcf9:36223/jars/aircompressor-0.10.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-ast_2.11-3.5.3.jar at spark://25f0969edcf9:36223/jars/json4s-ast_2.11-3.5.3.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xercesImpl-2.9.1.jar at spark://25f0969edcf9:36223/jars/xercesImpl-2.9.1.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.787 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kryo-shaded-4.0.2.jar at spark://25f0969edcf9:36223/jars/kryo-shaded-4.0.2.jar with timestamp 1608051775787
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-app-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-mapreduce-client-app-2.6.5.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-json-4.1.3.jar at spark://25f0969edcf9:36223/jars/metrics-json-4.1.3.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-framework-4.0.1.jar at spark://25f0969edcf9:36223/jars/curator-framework-4.0.1.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-client-2.29.1.jar at spark://25f0969edcf9:36223/jars/jersey-client-2.29.1.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-2.1.jar at spark://25f0969edcf9:36223/jars/hamcrest-2.1.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-3.9.9.Final.jar at spark://25f0969edcf9:36223/jars/netty-3.9.9.Final.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-2.29.1.jar at spark://25f0969edcf9:36223/jars/jersey-container-servlet-2.29.1.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-sql_2.11-2.4.5.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-server-2.29.1.jar at spark://25f0969edcf9:36223/jars/jersey-server-2.29.1.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xbean-asm6-shaded-4.10.jar at spark://25f0969edcf9:36223/jars/xbean-asm6-shaded-4.10.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-launcher_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-launcher_2.11-2.4.5.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-beans-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-beans-5.2.4.RELEASE.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.788 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-context-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-context-5.2.4.RELEASE.jar with timestamp 1608051775788
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-library-2.11.12.jar at spark://25f0969edcf9:36223/jars/scala-library-2.11.12.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lombok-1.18.12.jar at spark://25f0969edcf9:36223/jars/lombok-1.18.12.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-kafka-2.3.6.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-kafka-2.3.6.RELEASE.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-java-driver-3.11.2.jar at spark://25f0969edcf9:36223/jars/mongo-java-driver-3.11.2.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-core-2.1.jar at spark://25f0969edcf9:36223/jars/hamcrest-core-2.1.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/shims-0.7.45.jar at spark://25f0969edcf9:36223/jars/shims-0.7.45.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-vector-0.10.0.jar at spark://25f0969edcf9:36223/jars/arrow-vector-0.10.0.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.validation-api-2.0.2.jar at spark://25f0969edcf9:36223/jars/jakarta.validation-api-2.0.2.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jsr305-1.3.9.jar at spark://25f0969edcf9:36223/jars/jsr305-1.3.9.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-core-2.29.1.jar at spark://25f0969edcf9:36223/jars/jersey-container-servlet-core-2.29.1.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-all-4.1.45.Final.jar at spark://25f0969edcf9:36223/jars/netty-all-4.1.45.Final.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-shims-1.5.5.jar at spark://25f0969edcf9:36223/jars/orc-shims-1.5.5.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.789 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/junit-4.12.jar at spark://25f0969edcf9:36223/jars/junit-4.12.jar with timestamp 1608051775789
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-tags_2.11-2.4.5.jar at spark://25f0969edcf9:36223/jars/spark-tags_2.11-2.4.5.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/bson-3.11.2.jar at spark://25f0969edcf9:36223/jars/bson-3.11.2.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/amqp-client-5.7.3.jar at spark://25f0969edcf9:36223/jars/amqp-client-5.7.3.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/rocksdbjni-5.18.3.jar at spark://25f0969edcf9:36223/jars/rocksdbjni-5.18.3.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-collections-3.2.2.jar at spark://25f0969edcf9:36223/jars/commons-collections-3.2.2.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-common-2.29.1.jar at spark://25f0969edcf9:36223/jars/jersey-common-2.29.1.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-codec-1.13.jar at spark://25f0969edcf9:36223/jars/commons-codec-1.13.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zstd-jni-1.3.2-2.jar at spark://25f0969edcf9:36223/jars/zstd-jni-1.3.2-2.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/janino-3.0.8.jar at spark://25f0969edcf9:36223/jars/janino-3.0.8.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-beanutils-1.7.0.jar at spark://25f0969edcf9:36223/jars/commons-beanutils-1.7.0.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-common-1.10.1.jar at spark://25f0969edcf9:36223/jars/parquet-common-1.10.1.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-core-3.11.2.jar at spark://25f0969edcf9:36223/jars/mongodb-driver-core-3.11.2.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactive-streams-1.0.3.jar at spark://25f0969edcf9:36223/jars/reactive-streams-1.0.3.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.790 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xz-1.5.jar at spark://25f0969edcf9:36223/jars/xz-1.5.jar with timestamp 1608051775790
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.791 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-jackson_2.11-3.5.3.jar at spark://25f0969edcf9:36223/jars/json4s-jackson_2.11-3.5.3.jar with timestamp 1608051775791
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.796 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-core-4.1.3.jar at spark://25f0969edcf9:36223/jars/metrics-core-4.1.3.jar with timestamp 1608051775796
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | . ____ _ __ _ _
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.796 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/osgi-resource-locator-1.0.3.jar at spark://25f0969edcf9:36223/jars/osgi-resource-locator-1.0.3.jar with timestamp 1608051775796
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.796 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-auth-2.6.5.jar at spark://25f0969edcf9:36223/jars/hadoop-auth-2.6.5.jar with timestamp 1608051775796
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.796 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-core_2.11-3.5.3.jar at spark://25f0969edcf9:36223/jars/json4s-core_2.11-3.5.3.jar with timestamp 1608051775796
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | \\/ ___)| |_)| | | | | || (_| | ) ) ) )
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ' |____| .__|_| |_|_| |_\__, | / / / /
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.797 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-tx-5.2.4.RELEASE.jar at spark://25f0969edcf9:36223/jars/spring-tx-5.2.4.RELEASE.jar with timestamp 1608051775797
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | =========|_|==============|___/=/_/_/_/
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.797 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-core-1.5.5-nohive.jar at spark://25f0969edcf9:36223/jars/orc-core-1.5.5-nohive.jar with timestamp 1608051775797
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | :: Spring Boot :: (v2.2.5.RELEASE)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.797 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/error_prone_annotations-2.3.4.jar at spark://25f0969edcf9:36223/jars/error_prone_annotations-2.3.4.jar with timestamp 1608051775797
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.797 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/paranamer-2.8.jar at spark://25f0969edcf9:36223/jars/paranamer-2.8.jar with timestamp 1608051775797
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:48.339 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : Starting DangerousGoodProcessor v0.0.1-SNAPSHOT on 034ba3310158 with PID 1 (/app/classpath/dangerous-good-processor.original.jar started by root in /)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:48.341 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : No active profile set, falling back to default profiles: default
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:55.797 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/classpath/dangerous-good-processor.original.jar at spark://25f0969edcf9:36223/jars/dangerous-good-processor.original.jar with timestamp 1608051775797
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.033 INFO 1 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint : Connecting to master spark://spark-master:7077...
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:48.810 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Reactive MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.224 INFO 1 --- [pc-connection-0] o.a.s.n.client.TransportClientFactory : Successfully created connection to spark-master/10.0.6.246:7077 after 110 ms (0 ms spent in bootstraps)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:48.947 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 133ms. Found 0 Reactive MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.341 INFO 1 --- [er-event-loop-5] o.a.s.s.c.StandaloneSchedulerBackend : Connected to Spark cluster with app ID app-20201215170256-0008
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:48.954 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.342 INFO 1 --- [r-event-loop-12] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20201215170256-0008/0 on worker-20201215165722-10.0.6.251-38117 (10.0.6.251:38117) with 2 core(s)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:48.976 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 22ms. Found 1 MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.343 INFO 1 --- [r-event-loop-12] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20201215170256-0008/0 on hostPort 10.0.6.251:38117 with 2 core(s), 1024.0 MB RAM
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | WARNING: An illegal reflective access operation has occurred
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.365 INFO 1 --- [r-event-loop-13] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20201215170256-0008/0 is now RUNNING
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/app/libs/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.368 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 38871.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | WARNING: All illegal access operations will be denied in a future release
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:49.664 INFO 1 --- [ main] org.apache.spark.SparkContext : Running Spark version 2.4.5
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:49.930 WARN 1 --- [ main] org.apache.hadoop.util.NativeCodeLoader : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.082 INFO 1 --- [ main] org.apache.spark.SparkContext : Submitted application: dangerous-good-realtime-processor
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.127 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls to: root
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.128 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls to: root
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.128 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls groups to:
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.128 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls groups to:
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.128 INFO 1 --- [ main] org.apache.spark.SecurityManager : SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.437 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'sparkDriver' on port 41881.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.459 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering MapOutputTracker
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.476 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering BlockManagerMaster
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.484 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.484 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : BlockManagerMasterEndpoint up
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.499 INFO 1 --- [ main] o.apache.spark.storage.DiskBlockManager : Created local directory at /tmp/blockmgr-39dc576d-04b8-49ce-afb6-812f4e94ebff
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.514 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : MemoryStore started with capacity 9.2 GB
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.527 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering OutputCommitCoordinator
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.576 INFO 1 --- [ main] org.spark_project.jetty.util.log : Logging initialized @3174ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.616 INFO 1 --- [ main] org.spark_project.jetty.server.Server : jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.637 INFO 1 --- [ main] org.spark_project.jetty.server.Server : Started @3234ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.662 INFO 1 --- [ main] o.s.jetty.server.AbstractConnector : Started ServerConnector@27bb4dc5{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.663 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'SparkUI' on port 4040.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.716 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@11d2714a{/jobs,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.716 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6c9bf3b5{/jobs/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.717 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6f07d414{/jobs/job,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.717 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@734fbae3{/jobs/job/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.718 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@223967ea{/stages,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.718 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4a6a6a69{/stages/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.718 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5d7f1e59{/stages/stage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.719 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1c00d406{/stages/stage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.720 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@16ac4d3d{/stages/pool,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.720 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@559d19c{/stages/pool/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.721 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@719c1faf{/storage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.721 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6ba6557e{/storage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.722 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1f172892{/storage/rdd,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | . ____ _ __ _ _
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.722 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5409dfdd{/storage/rdd/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.723 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@45f9d394{/environment,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.723 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1c2d63f0{/environment/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.724 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3a588b5f{/executors,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | \\/ ___)| |_)| | | | | || (_| | ) ) ) )
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.724 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@20b54cfe{/executors/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ' |____| .__|_| |_|_| |_\__, | / / / /
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.725 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2bdb5e0f{/executors/threadDump,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | =========|_|==============|___/=/_/_/_/
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.726 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5c9e8a67{/executors/threadDump/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | :: Spring Boot :: (v2.2.5.RELEASE)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.731 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2262f0d8{/static,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.731 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2864f887{/,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:34.391 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : Starting DangerousGoodProcessor v0.0.1-SNAPSHOT on c899ec0ab125 with PID 1 (/app/classpath/dangerous-good-processor.original.jar started by root in /)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.732 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4b07cad0{/api,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:34.393 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : No active profile set, falling back to default profiles: default
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.733 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@349f0ca4{/jobs/job/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:35.254 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Reactive MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.734 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@66f2ec1c{/stages/stage/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:35.707 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 390ms. Found 0 Reactive MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.735 INFO 1 --- [ main] org.apache.spark.ui.SparkUI : Bound SparkUI to 0.0.0.0, and started at http://034ba3310158:4040
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:35.711 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.756 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-databind-2.10.2.jar at spark://034ba3310158:41881/jars/jackson-databind-2.10.2.jar with timestamp 1608051650751
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:35.740 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 29ms. Found 1 MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.757 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-api-2.3.1.jar at spark://034ba3310158:41881/jars/connect-api-2.3.1.jar with timestamp 1608051650757
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | WARNING: An illegal reflective access operation has occurred
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.759 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-logging-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-boot-starter-logging-2.2.5.RELEASE.jar with timestamp 1608051650759
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/app/libs/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.759 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jetty-util-6.1.26.jar at spark://034ba3310158:41881/jars/jetty-util-6.1.26.jar with timestamp 1608051650759
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.760 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-asl-1.9.13.jar at spark://034ba3310158:41881/jars/jackson-core-asl-1.9.13.jar with timestamp 1608051650760
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.369 INFO 1 --- [ main] o.a.s.n.netty.NettyBlockTransferService : Server created on 25f0969edcf9:38871
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.760 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-util-1.0.0-M20.jar at spark://034ba3310158:41881/jars/api-util-1.0.0-M20.jar with timestamp 1608051650760
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:29.231 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : No active profile set, falling back to default profiles: default
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | WARNING: All illegal access operations will be denied in a future release
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.760 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-parser-combinators_2.11-1.1.0.jar at spark://034ba3310158:41881/jars/scala-parser-combinators_2.11-1.1.0.jar with timestamp 1608051650760
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:29.954 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Reactive MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:36.543 INFO 1 --- [ main] org.apache.spark.SparkContext : Running Spark version 2.4.5
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.760 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-streams-2.3.1.jar at spark://034ba3310158:41881/jars/kafka-streams-2.3.1.jar with timestamp 1608051650760
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:30.109 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 151ms. Found 0 Reactive MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:36.894 WARN 1 --- [ main] org.apache.hadoop.util.NativeCodeLoader : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:30.113 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.760 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/RoaringBitmap-0.7.45.jar at spark://034ba3310158:41881/jars/RoaringBitmap-0.7.45.jar with timestamp 1608051650760
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:37.371 INFO 1 --- [ main] org.apache.spark.SparkContext : Submitted application: dangerous-good-realtime-processor
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:30.149 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 35ms. Found 1 MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.760 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-api-2.12.1.jar at spark://034ba3310158:41881/jars/log4j-api-2.12.1.jar with timestamp 1608051650760
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:37.414 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls to: root
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | WARNING: An illegal reflective access operation has occurred
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:37.415 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls to: root
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.760 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.activation-api-1.2.0.jar at spark://034ba3310158:41881/jars/javax.activation-api-1.2.0.jar with timestamp 1608051650760
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/app/libs/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:37.415 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls groups to:
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-core-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-mapreduce-client-core-2.6.5.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:37.415 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls groups to:
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.annotation-api-1.3.5.jar at spark://034ba3310158:41881/jars/jakarta.annotation-api-1.3.5.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:37.416 INFO 1 --- [ main] org.apache.spark.SecurityManager : SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jaxb-api-2.3.1.jar at spark://034ba3310158:41881/jars/jaxb-api-2.3.1.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | WARNING: All illegal access operations will be denied in a future release
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:37.941 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'sparkDriver' on port 34063.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-graphite-4.1.3.jar at spark://034ba3310158:41881/jars/metrics-graphite-4.1.3.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:31.217 INFO 1 --- [ main] org.apache.spark.SparkContext : Running Spark version 2.4.5
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.033 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering MapOutputTracker
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-format-0.10.0.jar at spark://034ba3310158:41881/jars/arrow-format-0.10.0.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:31.677 WARN 1 --- [ main] org.apache.hadoop.util.NativeCodeLoader : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.070 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering BlockManagerMaster
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-core-1.2.3.jar at spark://034ba3310158:41881/jars/logback-core-1.2.3.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.088 INFO 1 --- [ main] org.apache.spark.SparkContext : Submitted application: dangerous-good-realtime-processor
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-io-common-1.16.0.jar at spark://034ba3310158:41881/jars/jts-io-common-1.16.0.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.140 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls to: root
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-common-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-yarn-common-2.6.5.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.140 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls to: root
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.761 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-autoconfigure-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-boot-autoconfigure-2.2.5.RELEASE.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.140 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls groups to:
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.762 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-datatype-jdk8-2.10.2.jar at spark://034ba3310158:41881/jars/jackson-datatype-jdk8-2.10.2.jar with timestamp 1608051650761
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.141 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls groups to:
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.141 INFO 1 --- [ main] org.apache.spark.SecurityManager : SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.762 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/joda-time-2.9.9.jar at spark://034ba3310158:41881/jars/joda-time-2.9.9.jar with timestamp 1608051650762
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.669 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'sparkDriver' on port 44051.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.762 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/stream-2.7.0.jar at spark://034ba3310158:41881/jars/stream-2.7.0.jar with timestamp 1608051650762
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.744 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering MapOutputTracker
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.762 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xmlenc-0.52.jar at spark://034ba3310158:41881/jars/xmlenc-0.52.jar with timestamp 1608051650762
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.775 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering BlockManagerMaster
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.764 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-math3-3.4.1.jar at spark://034ba3310158:41881/jars/commons-math3-3.4.1.jar with timestamp 1608051650764
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.764 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-mapreduce-1.5.5-nohive.jar at spark://034ba3310158:41881/jars/orc-mapreduce-1.5.5-nohive.jar with timestamp 1608051650764
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.764 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar with timestamp 1608051650764
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.764 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-mongodb-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-data-mongodb-2.2.5.RELEASE.jar with timestamp 1608051650764
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-spark-connector_2.11-2.4.1.jar at spark://034ba3310158:41881/jars/mongo-spark-connector_2.11-2.4.1.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-to-slf4j-2.12.1.jar at spark://034ba3310158:41881/jars/log4j-to-slf4j-2.12.1.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-configuration-processor-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-boot-configuration-processor-2.2.5.RELEASE.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/j2objc-annotations-1.3.jar at spark://034ba3310158:41881/jars/j2objc-annotations-1.3.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-media-jaxb-2.29.1.jar at spark://034ba3310158:41881/jars/jersey-media-jaxb-2.29.1.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-ipc-1.8.2.jar at spark://034ba3310158:41881/jars/avro-ipc-1.8.2.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.777 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-common_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-network-common_2.11-2.4.5.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.784 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : BlockManagerMasterEndpoint up
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.832 INFO 1 --- [ main] o.apache.spark.storage.DiskBlockManager : Created local directory at /tmp/blockmgr-badda641-989b-40e2-9535-e3ac18be91fb
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.765 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-common-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-common-2.6.5.jar with timestamp 1608051650765
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.766 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-reactivestreams-1.12.0.jar at spark://034ba3310158:41881/jars/mongodb-driver-reactivestreams-1.12.0.jar with timestamp 1608051650766
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.852 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : MemoryStore started with capacity 9.2 GB
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.766 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-core-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-core-5.2.4.RELEASE.jar with timestamp 1608051650766
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.884 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering OutputCommitCoordinator
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:32.950 INFO 1 --- [ main] org.spark_project.jetty.util.log : Logging initialized @5735ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.766 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jcl-over-slf4j-1.7.30.jar at spark://034ba3310158:41881/jars/jcl-over-slf4j-1.7.30.jar with timestamp 1608051650766
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.012 INFO 1 --- [ main] org.spark_project.jetty.server.Server : jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.766 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-jobclient-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar with timestamp 1608051650766
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.140 INFO 1 --- [ main] org.spark_project.jetty.server.Server : Started @5925ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.766 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zookeeper-3.4.6.jar at spark://034ba3310158:41881/jars/zookeeper-3.4.6.jar with timestamp 1608051650766
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.152 INFO 1 --- [ main] o.s.jetty.server.AbstractConnector : Started ServerConnector@2864f887{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.766 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compiler-3.0.8.jar at spark://034ba3310158:41881/jars/commons-compiler-3.0.8.jar with timestamp 1608051650766
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.152 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'SparkUI' on port 4040.
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.187 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5d3f99d7{/jobs,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.768 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-reflect-2.11.12.jar at spark://034ba3310158:41881/jars/scala-reflect-2.11.12.jar with timestamp 1608051650768
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-format-2.4.0.jar at spark://034ba3310158:41881/jars/parquet-format-2.4.0.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.187 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@52ca0ad4{/jobs/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compress-1.8.1.jar at spark://034ba3310158:41881/jars/commons-compress-1.8.1.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.188 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4536a715{/jobs/job,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/slf4j-api-1.7.30.jar at spark://034ba3310158:41881/jars/slf4j-api-1.7.30.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.188 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@35dcd032{/jobs/job/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-i18n-2.0.0-M15.jar at spark://034ba3310158:41881/jars/apacheds-i18n-2.0.0-M15.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.189 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4522d793{/stages,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-commons-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-data-commons-2.2.5.RELEASE.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.193 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@64dfb31d{/stages/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill_2.11-0.9.3.jar at spark://034ba3310158:41881/jars/chill_2.11-0.9.3.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.194 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4438b862{/stages/stage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-classic-1.2.3.jar at spark://034ba3310158:41881/jars/logback-classic-1.2.3.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.195 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7e27f603{/stages/stage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/antlr4-runtime-4.7.jar at spark://034ba3310158:41881/jars/antlr4-runtime-4.7.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.195 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6136e1fc{/stages/pool,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.769 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-client-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-yarn-client-2.6.5.jar with timestamp 1608051650769
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.196 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1736c1e4{/stages/pool/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hppc-0.7.2.jar at spark://034ba3310158:41881/jars/hppc-0.7.2.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.196 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@32dcfeea{/storage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/checker-compat-qual-2.5.5.jar at spark://034ba3310158:41881/jars/checker-compat-qual-2.5.5.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.197 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@738a815c{/storage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.197 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2565a7d0{/storage/rdd,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-boot-2.2.5.RELEASE.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.199 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4fd7b79{/storage/rdd/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-io-2.4.jar at spark://034ba3310158:41881/jars/commons-io-2.4.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.199 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5103eea2{/environment,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.200 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@762405bf{/environment/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.072 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-2.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-boot-starter-2.2.5.RELEASE.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-2.10.2.jar at spark://034ba3310158:41881/jars/jackson-core-2.10.2.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-jaxrs-1.9.13.jar at spark://034ba3310158:41881/jars/jackson-jaxrs-1.9.13.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-aop-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-aop-5.2.4.RELEASE.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/compress-lzf-1.0.3.jar at spark://034ba3310158:41881/jars/compress-lzf-1.0.3.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.770 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-kvstore_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-kvstore_2.11-2.4.5.jar with timestamp 1608051650770
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-mapred-1.8.2-hadoop2.jar at spark://034ba3310158:41881/jars/avro-mapred-1.8.2-hadoop2.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpcore-4.4.13.jar at spark://034ba3310158:41881/jars/httpcore-4.4.13.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-paranamer-2.10.2.jar at spark://034ba3310158:41881/jars/jackson-module-paranamer-2.10.2.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-crypto-1.0.0.jar at spark://034ba3310158:41881/jars/commons-crypto-1.0.0.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-configuration-1.6.jar at spark://034ba3310158:41881/jars/commons-configuration-1.6.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-annotations-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-annotations-2.6.5.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-unsafe_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-unsafe_2.11-2.4.5.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-httpclient-3.1.jar at spark://034ba3310158:41881/jars/commons-httpclient-3.1.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/flatbuffers-1.2.0-3f79e055.jar at spark://034ba3310158:41881/jars/flatbuffers-1.2.0-3f79e055.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.771 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar at spark://034ba3310158:41881/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar with timestamp 1608051650771
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-jackson-1.10.1.jar at spark://034ba3310158:41881/jars/parquet-jackson-1.10.1.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-xc-1.9.13.jar at spark://034ba3310158:41881/jars/jackson-xc-1.9.13.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming-kafka-0-10_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-streaming-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/ivy-2.4.0.jar at spark://034ba3310158:41881/jars/ivy-2.4.0.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snakeyaml-1.25.jar at spark://034ba3310158:41881/jars/snakeyaml-1.25.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/htrace-core-3.0.4.jar at spark://034ba3310158:41881/jars/htrace-core-3.0.4.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-annotations-2.10.2.jar at spark://034ba3310158:41881/jars/jackson-annotations-2.10.2.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-server-common-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-yarn-server-common-2.6.5.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-recipes-4.0.1.jar at spark://034ba3310158:41881/jars/curator-recipes-4.0.1.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.772 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-asn1-api-1.0.0-M20.jar at spark://034ba3310158:41881/jars/api-asn1-api-1.0.0-M20.jar with timestamp 1608051650772
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/activation-1.1.1.jar at spark://034ba3310158:41881/jars/activation-1.1.1.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/objenesis-2.6.jar at spark://034ba3310158:41881/jars/objenesis-2.6.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-hadoop-1.10.1.jar at spark://034ba3310158:41881/jars/parquet-hadoop-1.10.1.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-1.2.17.jar at spark://034ba3310158:41881/jars/log4j-1.2.17.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/leveldbjni-all-1.8.jar at spark://034ba3310158:41881/jars/leveldbjni-all-1.8.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/guava-28.2-android.jar at spark://034ba3310158:41881/jars/guava-28.2-android.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-streaming_2.11-2.4.5.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-core_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-core_2.11-2.4.5.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-cli-1.2.jar at spark://034ba3310158:41881/jars/commons-cli-1.2.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-shuffle_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-network-shuffle_2.11-2.4.5.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-digester-1.8.jar at spark://034ba3310158:41881/jars/commons-digester-1.8.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.773 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/univocity-parsers-2.7.3.jar at spark://034ba3310158:41881/jars/univocity-parsers-2.7.3.jar with timestamp 1608051650773
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-xml_2.11-1.0.6.jar at spark://034ba3310158:41881/jars/scala-xml_2.11-1.0.6.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xml-apis-1.3.04.jar at spark://034ba3310158:41881/jars/xml-apis-1.3.04.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-jcl-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-jcl-5.2.4.RELEASE.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-jvm-4.1.3.jar at spark://034ba3310158:41881/jars/metrics-jvm-4.1.3.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/oro-2.0.8.jar at spark://034ba3310158:41881/jars/oro-2.0.8.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/failureaccess-1.0.1.jar at spark://034ba3310158:41881/jars/failureaccess-1.0.1.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sketch_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-sketch_2.11-2.4.5.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpclient-4.5.11.jar at spark://034ba3310158:41881/jars/httpclient-4.5.11.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-column-1.10.1.jar at spark://034ba3310158:41881/jars/parquet-column-1.10.1.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.ws.rs-api-2.1.6.jar at spark://034ba3310158:41881/jars/jakarta.ws.rs-api-2.1.6.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-client-2.6.0.jar at spark://034ba3310158:41881/jars/curator-client-2.6.0.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.774 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/minlog-1.3.0.jar at spark://034ba3310158:41881/jars/minlog-1.3.0.jar with timestamp 1608051650774
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.inject-2.6.1.jar at spark://034ba3310158:41881/jars/jakarta.inject-2.6.1.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-core-1.16.0.jar at spark://034ba3310158:41881/jars/jts-core-1.16.0.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/unused-1.0.0.jar at spark://034ba3310158:41881/jars/unused-1.0.0.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snappy-java-1.1.7.3.jar at spark://034ba3310158:41881/jars/snappy-java-1.1.7.3.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-encoding-1.10.1.jar at spark://034ba3310158:41881/jars/parquet-encoding-1.10.1.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-api-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-yarn-api-2.6.5.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-shuffle-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-clients-2.3.1.jar at spark://034ba3310158:41881/jars/kafka-clients-2.3.1.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-1.8.2.jar at spark://034ba3310158:41881/jars/avro-1.8.2.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jul-to-slf4j-1.7.30.jar at spark://034ba3310158:41881/jars/jul-to-slf4j-1.7.30.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.775 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/gson-2.8.6.jar at spark://034ba3310158:41881/jars/gson-2.8.6.jar with timestamp 1608051650775
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-scala_2.11-2.10.2.jar at spark://034ba3310158:41881/jars/jackson-module-scala_2.11-2.10.2.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-net-3.1.jar at spark://034ba3310158:41881/jars/commons-net-3.1.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-json-2.3.1.jar at spark://034ba3310158:41881/jars/connect-json-2.3.1.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.200 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7e77678c{/executors,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.201 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@113eed88{/executors/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.201 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5f0677f3{/executors/threadDump,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.204 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@9d99851{/executors/threadDump/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.212 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4cd7d5e1{/static,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.213 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@46b5f061{/,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.213 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3ba3f40d{/api,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.214 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@b5312df{/jobs/job/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.215 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4300e240{/stages/stage/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.216 INFO 1 --- [ main] org.apache.spark.ui.SparkUI : Bound SparkUI to 0.0.0.0, and started at http://5c1f7f9d3fca:4040
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.261 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-databind-2.10.2.jar at spark://5c1f7f9d3fca:44051/jars/jackson-databind-2.10.2.jar with timestamp 1608051693261
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.261 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-api-2.3.1.jar at spark://5c1f7f9d3fca:44051/jars/connect-api-2.3.1.jar with timestamp 1608051693261
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.261 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-logging-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-boot-starter-logging-2.2.5.RELEASE.jar with timestamp 1608051693261
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.261 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jetty-util-6.1.26.jar at spark://5c1f7f9d3fca:44051/jars/jetty-util-6.1.26.jar with timestamp 1608051693261
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.261 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-asl-1.9.13.jar at spark://5c1f7f9d3fca:44051/jars/jackson-core-asl-1.9.13.jar with timestamp 1608051693261
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-util-1.0.0-M20.jar at spark://5c1f7f9d3fca:44051/jars/api-util-1.0.0-M20.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-parser-combinators_2.11-1.1.0.jar at spark://5c1f7f9d3fca:44051/jars/scala-parser-combinators_2.11-1.1.0.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-streams-2.3.1.jar at spark://5c1f7f9d3fca:44051/jars/kafka-streams-2.3.1.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/RoaringBitmap-0.7.45.jar at spark://5c1f7f9d3fca:44051/jars/RoaringBitmap-0.7.45.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-api-2.12.1.jar at spark://5c1f7f9d3fca:44051/jars/log4j-api-2.12.1.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.activation-api-1.2.0.jar at spark://5c1f7f9d3fca:44051/jars/javax.activation-api-1.2.0.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-core-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-mapreduce-client-core-2.6.5.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.annotation-api-1.3.5.jar at spark://5c1f7f9d3fca:44051/jars/jakarta.annotation-api-1.3.5.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.262 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jaxb-api-2.3.1.jar at spark://5c1f7f9d3fca:44051/jars/jaxb-api-2.3.1.jar with timestamp 1608051693262
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-graphite-4.1.3.jar at spark://5c1f7f9d3fca:44051/jars/metrics-graphite-4.1.3.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-format-0.10.0.jar at spark://5c1f7f9d3fca:44051/jars/arrow-format-0.10.0.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-core-1.2.3.jar at spark://5c1f7f9d3fca:44051/jars/logback-core-1.2.3.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-io-common-1.16.0.jar at spark://5c1f7f9d3fca:44051/jars/jts-io-common-1.16.0.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-common-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-yarn-common-2.6.5.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-autoconfigure-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-boot-autoconfigure-2.2.5.RELEASE.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-datatype-jdk8-2.10.2.jar at spark://5c1f7f9d3fca:44051/jars/jackson-datatype-jdk8-2.10.2.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/joda-time-2.9.9.jar at spark://5c1f7f9d3fca:44051/jars/joda-time-2.9.9.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.263 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/stream-2.7.0.jar at spark://5c1f7f9d3fca:44051/jars/stream-2.7.0.jar with timestamp 1608051693263
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xmlenc-0.52.jar at spark://5c1f7f9d3fca:44051/jars/xmlenc-0.52.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-math3-3.4.1.jar at spark://5c1f7f9d3fca:44051/jars/commons-math3-3.4.1.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-mapreduce-1.5.5-nohive.jar at spark://5c1f7f9d3fca:44051/jars/orc-mapreduce-1.5.5-nohive.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-mongodb-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-data-mongodb-2.2.5.RELEASE.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-spark-connector_2.11-2.4.1.jar at spark://5c1f7f9d3fca:44051/jars/mongo-spark-connector_2.11-2.4.1.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-to-slf4j-2.12.1.jar at spark://5c1f7f9d3fca:44051/jars/log4j-to-slf4j-2.12.1.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.264 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-configuration-processor-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-boot-configuration-processor-2.2.5.RELEASE.jar with timestamp 1608051693264
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/j2objc-annotations-1.3.jar at spark://5c1f7f9d3fca:44051/jars/j2objc-annotations-1.3.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-media-jaxb-2.29.1.jar at spark://5c1f7f9d3fca:44051/jars/jersey-media-jaxb-2.29.1.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-ipc-1.8.2.jar at spark://5c1f7f9d3fca:44051/jars/avro-ipc-1.8.2.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-common_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-network-common_2.11-2.4.5.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-common-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-common-2.6.5.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-reactivestreams-1.12.0.jar at spark://5c1f7f9d3fca:44051/jars/mongodb-driver-reactivestreams-1.12.0.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-core-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-core-5.2.4.RELEASE.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jcl-over-slf4j-1.7.30.jar at spark://5c1f7f9d3fca:44051/jars/jcl-over-slf4j-1.7.30.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.265 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-jobclient-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar with timestamp 1608051693265
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zookeeper-3.4.6.jar at spark://5c1f7f9d3fca:44051/jars/zookeeper-3.4.6.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compiler-3.0.8.jar at spark://5c1f7f9d3fca:44051/jars/commons-compiler-3.0.8.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-reflect-2.11.12.jar at spark://5c1f7f9d3fca:44051/jars/scala-reflect-2.11.12.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-format-2.4.0.jar at spark://5c1f7f9d3fca:44051/jars/parquet-format-2.4.0.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compress-1.8.1.jar at spark://5c1f7f9d3fca:44051/jars/commons-compress-1.8.1.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/slf4j-api-1.7.30.jar at spark://5c1f7f9d3fca:44051/jars/slf4j-api-1.7.30.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-i18n-2.0.0-M15.jar at spark://5c1f7f9d3fca:44051/jars/apacheds-i18n-2.0.0-M15.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-commons-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-data-commons-2.2.5.RELEASE.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.266 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill_2.11-0.9.3.jar at spark://5c1f7f9d3fca:44051/jars/chill_2.11-0.9.3.jar with timestamp 1608051693266
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-classic-1.2.3.jar at spark://5c1f7f9d3fca:44051/jars/logback-classic-1.2.3.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/antlr4-runtime-4.7.jar at spark://5c1f7f9d3fca:44051/jars/antlr4-runtime-4.7.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-client-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-yarn-client-2.6.5.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hppc-0.7.2.jar at spark://5c1f7f9d3fca:44051/jars/hppc-0.7.2.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/checker-compat-qual-2.5.5.jar at spark://5c1f7f9d3fca:44051/jars/checker-compat-qual-2.5.5.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-boot-2.2.5.RELEASE.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-io-2.4.jar at spark://5c1f7f9d3fca:44051/jars/commons-io-2.4.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-2.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-boot-starter-2.2.5.RELEASE.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-2.10.2.jar at spark://5c1f7f9d3fca:44051/jars/jackson-core-2.10.2.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.267 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-jaxrs-1.9.13.jar at spark://5c1f7f9d3fca:44051/jars/jackson-jaxrs-1.9.13.jar with timestamp 1608051693267
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.268 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-aop-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-aop-5.2.4.RELEASE.jar with timestamp 1608051693268
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.268 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/compress-lzf-1.0.3.jar at spark://5c1f7f9d3fca:44051/jars/compress-lzf-1.0.3.jar with timestamp 1608051693268
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.268 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-kvstore_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-kvstore_2.11-2.4.5.jar with timestamp 1608051693268
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-mapred-1.8.2-hadoop2.jar at spark://5c1f7f9d3fca:44051/jars/avro-mapred-1.8.2-hadoop2.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpcore-4.4.13.jar at spark://5c1f7f9d3fca:44051/jars/httpcore-4.4.13.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-paranamer-2.10.2.jar at spark://5c1f7f9d3fca:44051/jars/jackson-module-paranamer-2.10.2.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-crypto-1.0.0.jar at spark://5c1f7f9d3fca:44051/jars/commons-crypto-1.0.0.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-configuration-1.6.jar at spark://5c1f7f9d3fca:44051/jars/commons-configuration-1.6.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-annotations-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-annotations-2.6.5.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-unsafe_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-unsafe_2.11-2.4.5.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.273 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-httpclient-3.1.jar at spark://5c1f7f9d3fca:44051/jars/commons-httpclient-3.1.jar with timestamp 1608051693273
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.274 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/flatbuffers-1.2.0-3f79e055.jar at spark://5c1f7f9d3fca:44051/jars/flatbuffers-1.2.0-3f79e055.jar with timestamp 1608051693274
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.274 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar at spark://5c1f7f9d3fca:44051/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar with timestamp 1608051693274
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.274 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-jackson-1.10.1.jar at spark://5c1f7f9d3fca:44051/jars/parquet-jackson-1.10.1.jar with timestamp 1608051693274
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.274 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-xc-1.9.13.jar at spark://5c1f7f9d3fca:44051/jars/jackson-xc-1.9.13.jar with timestamp 1608051693274
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.280 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming-kafka-0-10_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-streaming-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051693274
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.280 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/ivy-2.4.0.jar at spark://5c1f7f9d3fca:44051/jars/ivy-2.4.0.jar with timestamp 1608051693280
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.280 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snakeyaml-1.25.jar at spark://5c1f7f9d3fca:44051/jars/snakeyaml-1.25.jar with timestamp 1608051693280
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.280 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/htrace-core-3.0.4.jar at spark://5c1f7f9d3fca:44051/jars/htrace-core-3.0.4.jar with timestamp 1608051693280
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.280 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-annotations-2.10.2.jar at spark://5c1f7f9d3fca:44051/jars/jackson-annotations-2.10.2.jar with timestamp 1608051693280
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-server-common-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-yarn-server-common-2.6.5.jar with timestamp 1608051693280
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-recipes-4.0.1.jar at spark://5c1f7f9d3fca:44051/jars/curator-recipes-4.0.1.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-asn1-api-1.0.0-M20.jar at spark://5c1f7f9d3fca:44051/jars/api-asn1-api-1.0.0-M20.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/activation-1.1.1.jar at spark://5c1f7f9d3fca:44051/jars/activation-1.1.1.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/objenesis-2.6.jar at spark://5c1f7f9d3fca:44051/jars/objenesis-2.6.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-hadoop-1.10.1.jar at spark://5c1f7f9d3fca:44051/jars/parquet-hadoop-1.10.1.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-1.2.17.jar at spark://5c1f7f9d3fca:44051/jars/log4j-1.2.17.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/leveldbjni-all-1.8.jar at spark://5c1f7f9d3fca:44051/jars/leveldbjni-all-1.8.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/guava-28.2-android.jar at spark://5c1f7f9d3fca:44051/jars/guava-28.2-android.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-streaming_2.11-2.4.5.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-core_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-core_2.11-2.4.5.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-cli-1.2.jar at spark://5c1f7f9d3fca:44051/jars/commons-cli-1.2.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.281 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-shuffle_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-network-shuffle_2.11-2.4.5.jar with timestamp 1608051693281
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-digester-1.8.jar at spark://5c1f7f9d3fca:44051/jars/commons-digester-1.8.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/univocity-parsers-2.7.3.jar at spark://5c1f7f9d3fca:44051/jars/univocity-parsers-2.7.3.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-xml_2.11-1.0.6.jar at spark://5c1f7f9d3fca:44051/jars/scala-xml_2.11-1.0.6.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xml-apis-1.3.04.jar at spark://5c1f7f9d3fca:44051/jars/xml-apis-1.3.04.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-jcl-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-jcl-5.2.4.RELEASE.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-jvm-4.1.3.jar at spark://5c1f7f9d3fca:44051/jars/metrics-jvm-4.1.3.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/oro-2.0.8.jar at spark://5c1f7f9d3fca:44051/jars/oro-2.0.8.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/failureaccess-1.0.1.jar at spark://5c1f7f9d3fca:44051/jars/failureaccess-1.0.1.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sketch_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-sketch_2.11-2.4.5.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpclient-4.5.11.jar at spark://5c1f7f9d3fca:44051/jars/httpclient-4.5.11.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.282 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-column-1.10.1.jar at spark://5c1f7f9d3fca:44051/jars/parquet-column-1.10.1.jar with timestamp 1608051693282
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.ws.rs-api-2.1.6.jar at spark://5c1f7f9d3fca:44051/jars/jakarta.ws.rs-api-2.1.6.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-client-2.6.0.jar at spark://5c1f7f9d3fca:44051/jars/curator-client-2.6.0.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/minlog-1.3.0.jar at spark://5c1f7f9d3fca:44051/jars/minlog-1.3.0.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.inject-2.6.1.jar at spark://5c1f7f9d3fca:44051/jars/jakarta.inject-2.6.1.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-core-1.16.0.jar at spark://5c1f7f9d3fca:44051/jars/jts-core-1.16.0.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/unused-1.0.0.jar at spark://5c1f7f9d3fca:44051/jars/unused-1.0.0.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snappy-java-1.1.7.3.jar at spark://5c1f7f9d3fca:44051/jars/snappy-java-1.1.7.3.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-encoding-1.10.1.jar at spark://5c1f7f9d3fca:44051/jars/parquet-encoding-1.10.1.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-api-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-yarn-api-2.6.5.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-shuffle-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-clients-2.3.1.jar at spark://5c1f7f9d3fca:44051/jars/kafka-clients-2.3.1.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.283 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-1.8.2.jar at spark://5c1f7f9d3fca:44051/jars/avro-1.8.2.jar with timestamp 1608051693283
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jul-to-slf4j-1.7.30.jar at spark://5c1f7f9d3fca:44051/jars/jul-to-slf4j-1.7.30.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/gson-2.8.6.jar at spark://5c1f7f9d3fca:44051/jars/gson-2.8.6.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-scala_2.11-2.10.2.jar at spark://5c1f7f9d3fca:44051/jars/jackson-module-scala_2.11-2.10.2.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-net-3.1.jar at spark://5c1f7f9d3fca:44051/jars/commons-net-3.1.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-json-2.3.1.jar at spark://5c1f7f9d3fca:44051/jars/connect-json-2.3.1.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/protobuf-java-3.11.4.jar at spark://5c1f7f9d3fca:44051/jars/protobuf-java-3.11.4.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-retry-1.2.5.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-retry-1.2.5.RELEASE.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactor-core-3.3.3.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/reactor-core-3.3.3.RELEASE.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-3.11.2.jar at spark://5c1f7f9d3fca:44051/jars/mongodb-driver-3.11.2.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill-java-0.9.3.jar at spark://5c1f7f9d3fca:44051/jars/chill-java-0.9.3.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.284 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json-simple-1.1.1.jar at spark://5c1f7f9d3fca:44051/jars/json-simple-1.1.1.jar with timestamp 1608051693284
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-expression-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-expression-5.2.4.RELEASE.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-async-3.11.2.jar at spark://5c1f7f9d3fca:44051/jars/mongodb-driver-async-3.11.2.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.servlet-api-4.0.1.jar at spark://5c1f7f9d3fca:44051/jars/javax.servlet-api-4.0.1.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-client-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-client-2.6.5.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-common-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-mapreduce-client-common-2.6.5.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-memory-0.10.0.jar at spark://5c1f7f9d3fca:44051/jars/arrow-memory-0.10.0.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-mapper-asl-1.9.13.jar at spark://5c1f7f9d3fca:44051/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-hdfs-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-hdfs-2.6.5.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/py4j-0.10.7.jar at spark://5c1f7f9d3fca:44051/jars/py4j-0.10.7.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang3-3.9.jar at spark://5c1f7f9d3fca:44051/jars/commons-lang3-3.9.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.285 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql-kafka-0-10_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-sql-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-messaging-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-messaging-5.2.4.RELEASE.jar with timestamp 1608051693285
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lz4-java-1.4.0.jar at spark://5c1f7f9d3fca:44051/jars/lz4-java-1.4.0.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-scalap_2.11-3.5.3.jar at spark://5c1f7f9d3fca:44051/jars/json4s-scalap_2.11-3.5.3.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-kerberos-codec-2.0.0-M15.jar at spark://5c1f7f9d3fca:44051/jars/apacheds-kerberos-codec-2.0.0-M15.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang-2.6.jar at spark://5c1f7f9d3fca:44051/jars/commons-lang-2.6.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/pyrolite-4.13.jar at spark://5c1f7f9d3fca:44051/jars/pyrolite-4.13.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-catalyst_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-catalyst_2.11-2.4.5.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/aircompressor-0.10.jar at spark://5c1f7f9d3fca:44051/jars/aircompressor-0.10.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-ast_2.11-3.5.3.jar at spark://5c1f7f9d3fca:44051/jars/json4s-ast_2.11-3.5.3.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xercesImpl-2.9.1.jar at spark://5c1f7f9d3fca:44051/jars/xercesImpl-2.9.1.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.286 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kryo-shaded-4.0.2.jar at spark://5c1f7f9d3fca:44051/jars/kryo-shaded-4.0.2.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-app-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-mapreduce-client-app-2.6.5.jar with timestamp 1608051693286
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-json-4.1.3.jar at spark://5c1f7f9d3fca:44051/jars/metrics-json-4.1.3.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-framework-4.0.1.jar at spark://5c1f7f9d3fca:44051/jars/curator-framework-4.0.1.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-client-2.29.1.jar at spark://5c1f7f9d3fca:44051/jars/jersey-client-2.29.1.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-2.1.jar at spark://5c1f7f9d3fca:44051/jars/hamcrest-2.1.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-3.9.9.Final.jar at spark://5c1f7f9d3fca:44051/jars/netty-3.9.9.Final.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-2.29.1.jar at spark://5c1f7f9d3fca:44051/jars/jersey-container-servlet-2.29.1.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-sql_2.11-2.4.5.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-server-2.29.1.jar at spark://5c1f7f9d3fca:44051/jars/jersey-server-2.29.1.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xbean-asm6-shaded-4.10.jar at spark://5c1f7f9d3fca:44051/jars/xbean-asm6-shaded-4.10.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-launcher_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-launcher_2.11-2.4.5.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-beans-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-beans-5.2.4.RELEASE.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.287 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-context-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-context-5.2.4.RELEASE.jar with timestamp 1608051693287
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-library-2.11.12.jar at spark://5c1f7f9d3fca:44051/jars/scala-library-2.11.12.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lombok-1.18.12.jar at spark://5c1f7f9d3fca:44051/jars/lombok-1.18.12.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-kafka-2.3.6.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-kafka-2.3.6.RELEASE.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-java-driver-3.11.2.jar at spark://5c1f7f9d3fca:44051/jars/mongo-java-driver-3.11.2.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-core-2.1.jar at spark://5c1f7f9d3fca:44051/jars/hamcrest-core-2.1.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/shims-0.7.45.jar at spark://5c1f7f9d3fca:44051/jars/shims-0.7.45.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-vector-0.10.0.jar at spark://5c1f7f9d3fca:44051/jars/arrow-vector-0.10.0.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.073 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : BlockManagerMasterEndpoint up
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.100 INFO 1 --- [ main] o.apache.spark.storage.DiskBlockManager : Created local directory at /tmp/blockmgr-0b7e434e-8260-4f60-b8fd-ea892abd9c8c
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.117 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : MemoryStore started with capacity 9.2 GB
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.138 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering OutputCommitCoordinator
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.217 INFO 1 --- [ main] org.spark_project.jetty.util.log : Logging initialized @5592ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.258 INFO 1 --- [ main] org.spark_project.jetty.server.Server : jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.306 INFO 1 --- [ main] org.spark_project.jetty.server.Server : Started @5681ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.318 INFO 1 --- [ main] o.s.jetty.server.AbstractConnector : Started ServerConnector@113eed88{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.319 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'SparkUI' on port 4040.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.337 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@48a46b0f{/jobs,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.337 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1a17dd6f{/jobs/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.338 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@be6d228{/jobs/job,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.338 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5ab63a04{/jobs/job/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.339 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@677cc4e8{/stages,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.339 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4c6bba7d{/stages/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.339 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@41e8d917{/stages/stage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.345 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3a3ad8e7{/stages/stage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.345 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6a6d1ff3{/stages/pool,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.346 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@64b73e0a{/stages/pool/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.346 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@57c6feea{/storage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.347 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3b57f915{/storage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.347 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@39c7fb0b{/storage/rdd,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.348 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@645dc557{/storage/rdd/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.352 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@48c5698{/environment,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.353 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@17884d{/environment/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.353 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@68e24e7{/executors,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.354 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@c732e1c{/executors/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.354 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6aad919c{/executors/threadDump,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.355 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@56c42964{/executors/threadDump/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.378 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@211a9647{/static,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.384 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1930a804{/,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.385 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@bd4ee01{/api,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.386 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@69e2fe3b{/jobs/job/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.387 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@27afbf14{/stages/stage/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.388 INFO 1 --- [ main] org.apache.spark.ui.SparkUI : Bound SparkUI to 0.0.0.0, and started at http://c899ec0ab125:4040
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.411 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-databind-2.10.2.jar at spark://c899ec0ab125:34063/jars/jackson-databind-2.10.2.jar with timestamp 1608051818411
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.411 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-api-2.3.1.jar at spark://c899ec0ab125:34063/jars/connect-api-2.3.1.jar with timestamp 1608051818411
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-logging-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-boot-starter-logging-2.2.5.RELEASE.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jetty-util-6.1.26.jar at spark://c899ec0ab125:34063/jars/jetty-util-6.1.26.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-asl-1.9.13.jar at spark://c899ec0ab125:34063/jars/jackson-core-asl-1.9.13.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-util-1.0.0-M20.jar at spark://c899ec0ab125:34063/jars/api-util-1.0.0-M20.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-parser-combinators_2.11-1.1.0.jar at spark://c899ec0ab125:34063/jars/scala-parser-combinators_2.11-1.1.0.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-streams-2.3.1.jar at spark://c899ec0ab125:34063/jars/kafka-streams-2.3.1.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/RoaringBitmap-0.7.45.jar at spark://c899ec0ab125:34063/jars/RoaringBitmap-0.7.45.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.412 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-api-2.12.1.jar at spark://c899ec0ab125:34063/jars/log4j-api-2.12.1.jar with timestamp 1608051818412
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.activation-api-1.2.0.jar at spark://c899ec0ab125:34063/jars/javax.activation-api-1.2.0.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-core-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-mapreduce-client-core-2.6.5.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.annotation-api-1.3.5.jar at spark://c899ec0ab125:34063/jars/jakarta.annotation-api-1.3.5.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jaxb-api-2.3.1.jar at spark://c899ec0ab125:34063/jars/jaxb-api-2.3.1.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-graphite-4.1.3.jar at spark://c899ec0ab125:34063/jars/metrics-graphite-4.1.3.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-format-0.10.0.jar at spark://c899ec0ab125:34063/jars/arrow-format-0.10.0.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-core-1.2.3.jar at spark://c899ec0ab125:34063/jars/logback-core-1.2.3.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-io-common-1.16.0.jar at spark://c899ec0ab125:34063/jars/jts-io-common-1.16.0.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.413 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-common-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-yarn-common-2.6.5.jar with timestamp 1608051818413
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-autoconfigure-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-boot-autoconfigure-2.2.5.RELEASE.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-datatype-jdk8-2.10.2.jar at spark://c899ec0ab125:34063/jars/jackson-datatype-jdk8-2.10.2.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/joda-time-2.9.9.jar at spark://c899ec0ab125:34063/jars/joda-time-2.9.9.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/stream-2.7.0.jar at spark://c899ec0ab125:34063/jars/stream-2.7.0.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xmlenc-0.52.jar at spark://c899ec0ab125:34063/jars/xmlenc-0.52.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-math3-3.4.1.jar at spark://c899ec0ab125:34063/jars/commons-math3-3.4.1.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-mapreduce-1.5.5-nohive.jar at spark://c899ec0ab125:34063/jars/orc-mapreduce-1.5.5-nohive.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.414 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-mongodb-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-data-mongodb-2.2.5.RELEASE.jar with timestamp 1608051818414
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.415 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-spark-connector_2.11-2.4.1.jar at spark://c899ec0ab125:34063/jars/mongo-spark-connector_2.11-2.4.1.jar with timestamp 1608051818415
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.415 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-to-slf4j-2.12.1.jar at spark://c899ec0ab125:34063/jars/log4j-to-slf4j-2.12.1.jar with timestamp 1608051818415
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.415 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-configuration-processor-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-boot-configuration-processor-2.2.5.RELEASE.jar with timestamp 1608051818415
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.415 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/j2objc-annotations-1.3.jar at spark://c899ec0ab125:34063/jars/j2objc-annotations-1.3.jar with timestamp 1608051818415
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.415 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-media-jaxb-2.29.1.jar at spark://c899ec0ab125:34063/jars/jersey-media-jaxb-2.29.1.jar with timestamp 1608051818415
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.415 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-ipc-1.8.2.jar at spark://c899ec0ab125:34063/jars/avro-ipc-1.8.2.jar with timestamp 1608051818415
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-common_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-network-common_2.11-2.4.5.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-common-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-common-2.6.5.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-reactivestreams-1.12.0.jar at spark://c899ec0ab125:34063/jars/mongodb-driver-reactivestreams-1.12.0.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-core-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-core-5.2.4.RELEASE.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jcl-over-slf4j-1.7.30.jar at spark://c899ec0ab125:34063/jars/jcl-over-slf4j-1.7.30.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-jobclient-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zookeeper-3.4.6.jar at spark://c899ec0ab125:34063/jars/zookeeper-3.4.6.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compiler-3.0.8.jar at spark://c899ec0ab125:34063/jars/commons-compiler-3.0.8.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.417 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-reflect-2.11.12.jar at spark://c899ec0ab125:34063/jars/scala-reflect-2.11.12.jar with timestamp 1608051818417
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-format-2.4.0.jar at spark://c899ec0ab125:34063/jars/parquet-format-2.4.0.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compress-1.8.1.jar at spark://c899ec0ab125:34063/jars/commons-compress-1.8.1.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/slf4j-api-1.7.30.jar at spark://c899ec0ab125:34063/jars/slf4j-api-1.7.30.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-i18n-2.0.0-M15.jar at spark://c899ec0ab125:34063/jars/apacheds-i18n-2.0.0-M15.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-commons-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-data-commons-2.2.5.RELEASE.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill_2.11-0.9.3.jar at spark://c899ec0ab125:34063/jars/chill_2.11-0.9.3.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-classic-1.2.3.jar at spark://c899ec0ab125:34063/jars/logback-classic-1.2.3.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/antlr4-runtime-4.7.jar at spark://c899ec0ab125:34063/jars/antlr4-runtime-4.7.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.418 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-client-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-yarn-client-2.6.5.jar with timestamp 1608051818418
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hppc-0.7.2.jar at spark://c899ec0ab125:34063/jars/hppc-0.7.2.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/checker-compat-qual-2.5.5.jar at spark://c899ec0ab125:34063/jars/checker-compat-qual-2.5.5.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-boot-2.2.5.RELEASE.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-io-2.4.jar at spark://c899ec0ab125:34063/jars/commons-io-2.4.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-2.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-boot-starter-2.2.5.RELEASE.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-2.10.2.jar at spark://c899ec0ab125:34063/jars/jackson-core-2.10.2.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-jaxrs-1.9.13.jar at spark://c899ec0ab125:34063/jars/jackson-jaxrs-1.9.13.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-aop-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-aop-5.2.4.RELEASE.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.419 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/compress-lzf-1.0.3.jar at spark://c899ec0ab125:34063/jars/compress-lzf-1.0.3.jar with timestamp 1608051818419
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-kvstore_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-kvstore_2.11-2.4.5.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-mapred-1.8.2-hadoop2.jar at spark://c899ec0ab125:34063/jars/avro-mapred-1.8.2-hadoop2.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpcore-4.4.13.jar at spark://c899ec0ab125:34063/jars/httpcore-4.4.13.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-paranamer-2.10.2.jar at spark://c899ec0ab125:34063/jars/jackson-module-paranamer-2.10.2.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-crypto-1.0.0.jar at spark://c899ec0ab125:34063/jars/commons-crypto-1.0.0.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-configuration-1.6.jar at spark://c899ec0ab125:34063/jars/commons-configuration-1.6.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-annotations-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-annotations-2.6.5.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-unsafe_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-unsafe_2.11-2.4.5.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.420 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-httpclient-3.1.jar at spark://c899ec0ab125:34063/jars/commons-httpclient-3.1.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/flatbuffers-1.2.0-3f79e055.jar at spark://c899ec0ab125:34063/jars/flatbuffers-1.2.0-3f79e055.jar with timestamp 1608051818420
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar at spark://c899ec0ab125:34063/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-jackson-1.10.1.jar at spark://c899ec0ab125:34063/jars/parquet-jackson-1.10.1.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-xc-1.9.13.jar at spark://c899ec0ab125:34063/jars/jackson-xc-1.9.13.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming-kafka-0-10_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-streaming-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/ivy-2.4.0.jar at spark://c899ec0ab125:34063/jars/ivy-2.4.0.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snakeyaml-1.25.jar at spark://c899ec0ab125:34063/jars/snakeyaml-1.25.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/htrace-core-3.0.4.jar at spark://c899ec0ab125:34063/jars/htrace-core-3.0.4.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.421 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-annotations-2.10.2.jar at spark://c899ec0ab125:34063/jars/jackson-annotations-2.10.2.jar with timestamp 1608051818421
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-server-common-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-yarn-server-common-2.6.5.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-recipes-4.0.1.jar at spark://c899ec0ab125:34063/jars/curator-recipes-4.0.1.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-asn1-api-1.0.0-M20.jar at spark://c899ec0ab125:34063/jars/api-asn1-api-1.0.0-M20.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/activation-1.1.1.jar at spark://c899ec0ab125:34063/jars/activation-1.1.1.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/objenesis-2.6.jar at spark://c899ec0ab125:34063/jars/objenesis-2.6.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-hadoop-1.10.1.jar at spark://c899ec0ab125:34063/jars/parquet-hadoop-1.10.1.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-1.2.17.jar at spark://c899ec0ab125:34063/jars/log4j-1.2.17.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/leveldbjni-all-1.8.jar at spark://c899ec0ab125:34063/jars/leveldbjni-all-1.8.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/guava-28.2-android.jar at spark://c899ec0ab125:34063/jars/guava-28.2-android.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.422 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-streaming_2.11-2.4.5.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-core_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-core_2.11-2.4.5.jar with timestamp 1608051818422
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-cli-1.2.jar at spark://c899ec0ab125:34063/jars/commons-cli-1.2.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-shuffle_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-network-shuffle_2.11-2.4.5.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-digester-1.8.jar at spark://c899ec0ab125:34063/jars/commons-digester-1.8.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/univocity-parsers-2.7.3.jar at spark://c899ec0ab125:34063/jars/univocity-parsers-2.7.3.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-xml_2.11-1.0.6.jar at spark://c899ec0ab125:34063/jars/scala-xml_2.11-1.0.6.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xml-apis-1.3.04.jar at spark://c899ec0ab125:34063/jars/xml-apis-1.3.04.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-jcl-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-jcl-5.2.4.RELEASE.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-jvm-4.1.3.jar at spark://c899ec0ab125:34063/jars/metrics-jvm-4.1.3.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.423 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/oro-2.0.8.jar at spark://c899ec0ab125:34063/jars/oro-2.0.8.jar with timestamp 1608051818423
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/failureaccess-1.0.1.jar at spark://c899ec0ab125:34063/jars/failureaccess-1.0.1.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sketch_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-sketch_2.11-2.4.5.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpclient-4.5.11.jar at spark://c899ec0ab125:34063/jars/httpclient-4.5.11.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-column-1.10.1.jar at spark://c899ec0ab125:34063/jars/parquet-column-1.10.1.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.ws.rs-api-2.1.6.jar at spark://c899ec0ab125:34063/jars/jakarta.ws.rs-api-2.1.6.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-client-2.6.0.jar at spark://c899ec0ab125:34063/jars/curator-client-2.6.0.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/minlog-1.3.0.jar at spark://c899ec0ab125:34063/jars/minlog-1.3.0.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.inject-2.6.1.jar at spark://c899ec0ab125:34063/jars/jakarta.inject-2.6.1.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.424 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-core-1.16.0.jar at spark://c899ec0ab125:34063/jars/jts-core-1.16.0.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/unused-1.0.0.jar at spark://c899ec0ab125:34063/jars/unused-1.0.0.jar with timestamp 1608051818424
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snappy-java-1.1.7.3.jar at spark://c899ec0ab125:34063/jars/snappy-java-1.1.7.3.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.370 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.389 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registering BlockManager BlockManagerId(driver, 25f0969edcf9, 38871, None)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.394 INFO 1 --- [r-event-loop-14] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 25f0969edcf9:38871 with 9.2 GB RAM, BlockManagerId(driver, 25f0969edcf9, 38871, None)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.396 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registered BlockManager BlockManagerId(driver, 25f0969edcf9, 38871, None)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.416 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Initialized BlockManager: BlockManagerId(driver, 25f0969edcf9, 38871, None)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.544 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@59f3426f{/metrics/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:56.580 INFO 1 --- [ main] o.a.s.s.c.StandaloneSchedulerBackend : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.199 WARN 1 --- [ main] org.apache.spark.SparkContext : Using an existing SparkContext; some configuration may not take effect.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.674 INFO 1 --- [ main] org.mongodb.driver.cluster : Cluster created with settings {hosts=[mongo-server:27017], mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500, requiredReplicaSetName='rs0'}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.675 INFO 1 --- [ main] org.mongodb.driver.cluster : Adding discovered server mongo-server:27017 to client view of cluster
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.766 INFO 1 --- [go-server:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:1, serverValue:37}] to mongo-server:27017
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.777 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-server:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=2880366, setName='rs0', canonicalAddress=mongo-server:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=7fffffff000000000000001d, setVersion=37028, lastWriteDate=Tue Dec 15 17:02:55 UTC 2020, lastUpdateTimeNanos=1845494304719813}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.779 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-0:27017 to client view of cluster
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.780 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-1:27017 to client view of cluster
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.783 INFO 1 --- [replica-0:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:2, serverValue:34}] to mongo-replica-0:27017
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.784 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max election id to 7fffffff000000000000001d from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.784 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max set version to 37028 from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.784 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Discovered replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.787 INFO 1 --- [replica-1:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:3, serverValue:36}] to mongo-replica-1:27017
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.787 INFO 1 --- [replica-0:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-0:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=3567615, setName='rs0', canonicalAddress=mongo-replica-0:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:02:55 UTC 2020, lastUpdateTimeNanos=1845494321364298}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:57.787 INFO 1 --- [replica-1:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-1:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=669485, setName='rs0', canonicalAddress=mongo-replica-1:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:02:55 UTC 2020, lastUpdateTimeNanos=1845494322310054}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:58.421 INFO 1 --- [ main] org.mongodb.driver.connection : Opened connection [connectionId{localValue:4, serverValue:38}] to mongo-server:27017
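The cluster messages above (hosts `mongo-server:27017`, `requiredReplicaSetName='rs0'`, `requiredClusterType=REPLICA_SET`) are consistent with a replica-set connection string. A hypothetical Spring Boot property that would produce this client configuration might look like the following — the host and set name are taken from the log, while the database name is not shown there and is left as a placeholder:

```properties
# Hypothetical sketch; only the host and replicaSet values appear in the log.
spring.data.mongodb.uri=mongodb://mongo-server:27017/<database>?replicaSet=rs0
```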
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ===========================================================
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | PortalMetadata(id=5f51f7ecdf7c147d8529cc8d, portalName=P7, expiration=60, asset=com.tecnositaf.dg.processor.model.assets.Asset@5570dc21)
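The `PortalMetadata` debug line above prints the nested asset as `com.tecnositaf.dg.processor.model.assets.Asset@5570dc21`, i.e. the default `Object.toString()`. A minimal sketch of overriding `toString()` so the field values show up in the log — the `unCode` field is hypothetical, chosen only for illustration:

```java
// Sketch: give the model class a toString() so debug output shows field
// values instead of the default "Asset@5570dc21" identity hash form.
public class Main {
    static class Asset {
        private final String unCode; // hypothetical field, not taken from the log

        Asset(String unCode) {
            this.unCode = unCode;
        }

        @Override
        public String toString() {
            return "Asset(unCode=" + unCode + ")";
        }
    }

    public static void main(String[] args) {
        // prints Asset(unCode=UN1203)
        System.out.println(new Asset("UN1203"));
    }
}
```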
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.054 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0 stored as values in memory (estimated size 22.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.144 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.2 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.146 INFO 1 --- [er-event-loop-7] o.apache.spark.storage.BlockManagerInfo : Added broadcast_0_piece0 in memory on 25f0969edcf9:38871 (size: 2.2 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.156 INFO 1 --- [ main] org.apache.spark.SparkContext : Created broadcast 0 from broadcast at RealtimeProcessor.java:81
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.168 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.168 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Warehouse path is 'file:/spark-warehouse'.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.183 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@441762b8{/SQL,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.184 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2d2fe68a{/SQL/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.184 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5e75cf7a{/SQL/execution,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.185 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6f12fca0{/SQL/execution/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.186 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@70680f88{/static/sql,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.652 INFO 1 --- [ main] o.a.s.s.e.s.s.StateStoreCoordinatorRef : Registered StateStoreCoordinator endpoint
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.808 INFO 1 --- [er-event-loop-6] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.6.251:51754) with ID 0
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.948 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding enable.auto.commit to false for executor
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.948 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding auto.offset.reset to none for executor
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.949 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding executor group.id to spark-executor-realtime-group
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:02:59.949 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding receive.buffer.bytes to 65536 see KAFKA-3135
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | starting dangerous good processor
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.122 INFO 1 --- [er-event-loop-2] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 10.0.6.251:41429 with 366.3 MB RAM, BlockManagerId(0, 10.0.6.251, 41429, None)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.255 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.ShuffledDStream@2b8a11d7
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.255 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.MappedDStream@467625e3
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.255 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.TransformedDStream@57e2d15c
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.255 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@9214725
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.257 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.257 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.257 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@9214725
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@57e2d15c
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@3ed1446c
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@9214725
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.258 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@57e2d15c
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@467625e3
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@2b8a11d7
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@4c32aa7e
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@49e54217
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@2972b493
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@2620264e
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.259 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.260 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.260 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@209baf91
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@9214725
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.264 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@57e2d15c
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@467625e3
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.265 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@2b8a11d7
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@4c32aa7e
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@49e54217
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@2972b493
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@2620264e
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.276 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@386ec37d
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@520e6089
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@9214725
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@57e2d15c
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@467625e3
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@2b8a11d7
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@4c32aa7e
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@49e54217
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.277 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@2972b493
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@2620264e
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@4d65fbad
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@5cb654e3
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@9214725
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.278 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@39cf7f3c
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.362 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | allow.auto.create.topics = true
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | auto.commit.interval.ms = 5000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | auto.offset.reset = latest
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | bootstrap.servers = [kafka-broker:9092]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | check.crcs = true
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | client.dns.lookup = default
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | client.id =
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | client.rack =
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | connections.max.idle.ms = 540000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | default.api.timeout.ms = 60000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | enable.auto.commit = false
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | exclude.internal.topics = true
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | fetch.max.bytes = 52428800
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | fetch.max.wait.ms = 500
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | fetch.min.bytes = 1
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | group.id = realtime-group
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | group.instance.id = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | heartbeat.interval.ms = 3000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | interceptor.classes = []
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | internal.leave.group.on.close = true
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | isolation.level = read_uncommitted
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | max.partition.fetch.bytes = 1048576
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | max.poll.interval.ms = 300000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | max.poll.records = 500
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | metadata.max.age.ms = 300000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | metric.reporters = []
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | metrics.num.samples = 2
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | metrics.recording.level = INFO
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | metrics.sample.window.ms = 30000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | receive.buffer.bytes = 65536
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | reconnect.backoff.max.ms = 1000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | reconnect.backoff.ms = 50
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | request.timeout.ms = 30000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | retry.backoff.ms = 100
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.client.callback.handler.class = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.jaas.config = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.kerberos.kinit.cmd = /usr/bin/kinit
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.kerberos.min.time.before.relogin = 60000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.kerberos.service.name = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.kerberos.ticket.renew.jitter = 0.05
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.kerberos.ticket.renew.window.factor = 0.8
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.login.callback.handler.class = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.login.class = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.login.refresh.buffer.seconds = 300
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.login.refresh.min.period.seconds = 60
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.login.refresh.window.factor = 0.8
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.login.refresh.window.jitter = 0.05
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | sasl.mechanism = GSSAPI
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | security.protocol = PLAINTEXT
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | send.buffer.bytes = 131072
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | session.timeout.ms = 10000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.cipher.suites = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.endpoint.identification.algorithm = https
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.key.password = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.keymanager.algorithm = SunX509
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.keystore.location = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.keystore.password = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.keystore.type = JKS
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.protocol = TLS
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.provider = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.secure.random.implementation = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.trustmanager.algorithm = PKIX
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.truststore.location = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.truststore.password = null
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | ssl.truststore.type = JKS
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | value.deserializer = class com.tecnositaf.dg.processor.model.serde.IncomingDataDeserializer
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.430 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.3.1
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.430 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 18a913733fb71c01
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.430 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1608051780429
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.433 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-1, groupId=realtime-group] Subscribed to topic(s): datagenerator-producer-topic
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.530 INFO 1 --- [ool-2-worker-29] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-1, groupId=realtime-group] Cluster ID: A6TeuCvYQcyPFIxM9OVJ0g
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.531 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Discovered group coordinator kafka-broker:9092 (id: 2147483646 rack: null)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.533 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Revoking previously assigned partitions []
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.533 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:00.538 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.545 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Successfully joined group with generation 17
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.548 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Setting newly assigned partitions: datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.553 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Found no committed offset for partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.559 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.560 INFO 1 --- [streaming-start] o.a.spark.streaming.util.RecurringTimer : Started timer for JobGenerator at time 1608051782000
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.560 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobGenerator : Started JobGenerator at 1608051782000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.560 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobScheduler : Started JobScheduler
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.563 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@381db99e{/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.563 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@58545c2e{/streaming/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.563 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3681c285{/streaming/batch,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.564 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6894d28d{/streaming/batch/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.565 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@21139739{/static/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.565 INFO 1 --- [ main] o.a.spark.streaming.StreamingContext : StreamingContext started
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.567 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.569 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.583 INFO 1 --- [ JobGenerator] o.a.s.s.d.InternalMapWithStateDStream : Time 1608051780000 ms is invalid as zeroTime is 1608051780000 ms , slideDuration is 2000 ms and difference is 0 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.603 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051782000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.604 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051782000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.604 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051782000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.604 INFO 1 --- [ JobScheduler] o.a.s.streaming.scheduler.JobScheduler : Starting job streaming job 1608051782000 ms.0 from job set of time 1608051782000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.606 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051782000 ms
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.611 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051782000 ms to writer queue
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.611 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051782000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051782000'
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.619 INFO 1 --- [-job-executor-0] org.apache.spark.SparkContext : Starting job: isEmpty at RealtimeProcessor.java:302
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.626 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Got job 0 (isEmpty at RealtimeProcessor.java:302) with 1 output partitions
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.627 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Final stage: ResultStage 0 (isEmpty at RealtimeProcessor.java:302)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.627 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Parents of final stage: List()
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.628 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Missing parents: List()
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.630 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205), which has no missing parents
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.642 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1 stored as values in memory (estimated size 15.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.644 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1_piece0 stored as bytes in memory (estimated size 6.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.644 INFO 1 --- [er-event-loop-5] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on 25f0969edcf9:38871 (size: 6.7 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.645 INFO 1 --- [uler-event-loop] org.apache.spark.SparkContext : Created broadcast 1 from broadcast at DAGScheduler.scala:1163
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.652 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205) (first 15 tasks are for partitions Vector(0))
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.653 INFO 1 --- [uler-event-loop] o.a.spark.scheduler.TaskSchedulerImpl : Adding task set 0.0 with 1 tasks
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.658 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.658 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062840_322016
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.661 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.668 WARN 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | . ____ _ __ _ _
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | \\/ ___)| |_)| | | | | || (_| | ) ) ) )
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ' |____| .__|_| |_|_| |_\__, | / / / /
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | =========|_|==============|___/=/_/_/_/
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | :: Spring Boot :: (v2.2.5.RELEASE)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:10.927 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : Starting DangerousGoodProcessor v0.0.1-SNAPSHOT on 909dfb133bfc with PID 1 (/app/classpath/dangerous-good-processor.original.jar started by root in /)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:10.928 INFO 1 --- [ main] c.t.dg.processor.DangerousGoodProcessor : No active profile set, falling back to default profiles: default
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.192 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Reactive MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.288 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 92ms. Found 0 Reactive MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.291 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data MongoDB repositories in DEFAULT mode.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.305 INFO 1 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 14ms. Found 1 MongoDB repository interfaces.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | WARNING: An illegal reflective access operation has occurred
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/app/libs/spark-unsafe_2.11-2.4.5.jar) to method java.nio.Bits.unaligned()
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | WARNING: All illegal access operations will be denied in a future release
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.685 INFO 1 --- [ main] org.apache.spark.SparkContext : Running Spark version 2.4.5
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.837 WARN 1 --- [ main] org.apache.hadoop.util.NativeCodeLoader : Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.914 INFO 1 --- [ main] org.apache.spark.SparkContext : Submitted application: dangerous-good-realtime-processor
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.943 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls to: root
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.943 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls to: root
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.943 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing view acls groups to:
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.943 INFO 1 --- [ main] org.apache.spark.SecurityManager : Changing modify acls groups to:
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:11.943 INFO 1 --- [ main] org.apache.spark.SecurityManager : SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.140 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'sparkDriver' on port 46059.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.152 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering MapOutputTracker
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.161 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering BlockManagerMaster
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.163 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.163 INFO 1 --- [ main] o.a.s.s.BlockManagerMasterEndpoint : BlockManagerMasterEndpoint up
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.173 INFO 1 --- [ main] o.apache.spark.storage.DiskBlockManager : Created local directory at /tmp/blockmgr-c3375850-4f86-4bf7-8838-2973e9be63a3
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.182 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : MemoryStore started with capacity 9.2 GB
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.189 INFO 1 --- [ main] org.apache.spark.SparkEnv : Registering OutputCommitCoordinator
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.223 INFO 1 --- [ main] org.spark_project.jetty.util.log : Logging initialized @1847ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.251 INFO 1 --- [ main] org.spark_project.jetty.server.Server : jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.260 INFO 1 --- [ main] org.spark_project.jetty.server.Server : Started @1884ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.268 INFO 1 --- [ main] o.s.jetty.server.AbstractConnector : Started ServerConnector@aca2fc3{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.269 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'SparkUI' on port 4040.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.279 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7493d937{/jobs,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.279 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1b3bec6c{/jobs/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.280 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@53f7a906{/jobs/job,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.281 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4c2ab536{/jobs/job/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.281 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@22fb9a2c{/stages,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.281 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2084e65a{/stages/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.282 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1b8fa2fa{/stages/stage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.283 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@360d41d0{/stages/stage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.283 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4586a8aa{/stages/pool,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.284 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3dc82e6a{/stages/pool/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.284 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@54463380{/storage,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.285 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@819fb19{/storage/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.285 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@782ac148{/storage/rdd,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.285 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@46d1b59{/storage/rdd/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.286 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@61d2f267{/environment,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.287 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@26d40c77{/environment/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.287 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@85ab964{/executors,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.288 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6481dce5{/executors/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.288 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@12365bd8{/executors/threadDump,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.289 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@14874a5d{/executors/threadDump/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.292 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@49cb5030{/static,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.293 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5a06904{/,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.294 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@fabef2e{/api,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.294 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@286866cb{/jobs/job/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.295 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@ce561cc{/stages/stage/kill,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.296 INFO 1 --- [ main] org.apache.spark.ui.SparkUI : Bound SparkUI to 0.0.0.0, and started at http://909dfb133bfc:4040
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.307 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-databind-2.10.2.jar at spark://909dfb133bfc:46059/jars/jackson-databind-2.10.2.jar with timestamp 1608051732307
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.307 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-api-2.3.1.jar at spark://909dfb133bfc:46059/jars/connect-api-2.3.1.jar with timestamp 1608051732307
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.307 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-logging-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-boot-starter-logging-2.2.5.RELEASE.jar with timestamp 1608051732307
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.307 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jetty-util-6.1.26.jar at spark://909dfb133bfc:46059/jars/jetty-util-6.1.26.jar with timestamp 1608051732307
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-asl-1.9.13.jar at spark://909dfb133bfc:46059/jars/jackson-core-asl-1.9.13.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-util-1.0.0-M20.jar at spark://909dfb133bfc:46059/jars/api-util-1.0.0-M20.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-parser-combinators_2.11-1.1.0.jar at spark://909dfb133bfc:46059/jars/scala-parser-combinators_2.11-1.1.0.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-streams-2.3.1.jar at spark://909dfb133bfc:46059/jars/kafka-streams-2.3.1.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/RoaringBitmap-0.7.45.jar at spark://909dfb133bfc:46059/jars/RoaringBitmap-0.7.45.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-api-2.12.1.jar at spark://909dfb133bfc:46059/jars/log4j-api-2.12.1.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.activation-api-1.2.0.jar at spark://909dfb133bfc:46059/jars/javax.activation-api-1.2.0.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.308 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-core-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-mapreduce-client-core-2.6.5.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.309 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.annotation-api-1.3.5.jar at spark://909dfb133bfc:46059/jars/jakarta.annotation-api-1.3.5.jar with timestamp 1608051732308
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.309 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jaxb-api-2.3.1.jar at spark://909dfb133bfc:46059/jars/jaxb-api-2.3.1.jar with timestamp 1608051732309
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.309 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-graphite-4.1.3.jar at spark://909dfb133bfc:46059/jars/metrics-graphite-4.1.3.jar with timestamp 1608051732309
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.309 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-format-0.10.0.jar at spark://909dfb133bfc:46059/jars/arrow-format-0.10.0.jar with timestamp 1608051732309
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.309 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-core-1.2.3.jar at spark://909dfb133bfc:46059/jars/logback-core-1.2.3.jar with timestamp 1608051732309
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.309 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-io-common-1.16.0.jar at spark://909dfb133bfc:46059/jars/jts-io-common-1.16.0.jar with timestamp 1608051732309
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-common-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-yarn-common-2.6.5.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-autoconfigure-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-boot-autoconfigure-2.2.5.RELEASE.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-datatype-jdk8-2.10.2.jar at spark://909dfb133bfc:46059/jars/jackson-datatype-jdk8-2.10.2.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/joda-time-2.9.9.jar at spark://909dfb133bfc:46059/jars/joda-time-2.9.9.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/stream-2.7.0.jar at spark://909dfb133bfc:46059/jars/stream-2.7.0.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xmlenc-0.52.jar at spark://909dfb133bfc:46059/jars/xmlenc-0.52.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-math3-3.4.1.jar at spark://909dfb133bfc:46059/jars/commons-math3-3.4.1.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-mapreduce-1.5.5-nohive.jar at spark://909dfb133bfc:46059/jars/orc-mapreduce-1.5.5-nohive.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-boot-starter-data-mongodb-reactive-2.2.5.RELEASE.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.310 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-mongodb-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-data-mongodb-2.2.5.RELEASE.jar with timestamp 1608051732310
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-spark-connector_2.11-2.4.1.jar at spark://909dfb133bfc:46059/jars/mongo-spark-connector_2.11-2.4.1.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-to-slf4j-2.12.1.jar at spark://909dfb133bfc:46059/jars/log4j-to-slf4j-2.12.1.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-configuration-processor-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-boot-configuration-processor-2.2.5.RELEASE.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/j2objc-annotations-1.3.jar at spark://909dfb133bfc:46059/jars/j2objc-annotations-1.3.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-media-jaxb-2.29.1.jar at spark://909dfb133bfc:46059/jars/jersey-media-jaxb-2.29.1.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-ipc-1.8.2.jar at spark://909dfb133bfc:46059/jars/avro-ipc-1.8.2.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-common_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-network-common_2.11-2.4.5.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-common-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-common-2.6.5.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-reactivestreams-1.12.0.jar at spark://909dfb133bfc:46059/jars/mongodb-driver-reactivestreams-1.12.0.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.311 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-core-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-core-5.2.4.RELEASE.jar with timestamp 1608051732311
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jcl-over-slf4j-1.7.30.jar at spark://909dfb133bfc:46059/jars/jcl-over-slf4j-1.7.30.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-jobclient-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-mapreduce-client-jobclient-2.6.5.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zookeeper-3.4.6.jar at spark://909dfb133bfc:46059/jars/zookeeper-3.4.6.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compiler-3.0.8.jar at spark://909dfb133bfc:46059/jars/commons-compiler-3.0.8.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-reflect-2.11.12.jar at spark://909dfb133bfc:46059/jars/scala-reflect-2.11.12.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-format-2.4.0.jar at spark://909dfb133bfc:46059/jars/parquet-format-2.4.0.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-compress-1.8.1.jar at spark://909dfb133bfc:46059/jars/commons-compress-1.8.1.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/slf4j-api-1.7.30.jar at spark://909dfb133bfc:46059/jars/slf4j-api-1.7.30.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-i18n-2.0.0-M15.jar at spark://909dfb133bfc:46059/jars/apacheds-i18n-2.0.0-M15.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-data-commons-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-data-commons-2.2.5.RELEASE.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill_2.11-0.9.3.jar at spark://909dfb133bfc:46059/jars/chill_2.11-0.9.3.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/logback-classic-1.2.3.jar at spark://909dfb133bfc:46059/jars/logback-classic-1.2.3.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.312 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/antlr4-runtime-4.7.jar at spark://909dfb133bfc:46059/jars/antlr4-runtime-4.7.jar with timestamp 1608051732312
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-client-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-yarn-client-2.6.5.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hppc-0.7.2.jar at spark://909dfb133bfc:46059/jars/hppc-0.7.2.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/checker-compat-qual-2.5.5.jar at spark://909dfb133bfc:46059/jars/checker-compat-qual-2.5.5.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-boot-2.2.5.RELEASE.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-io-2.4.jar at spark://909dfb133bfc:46059/jars/commons-io-2.4.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-boot-starter-2.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-boot-starter-2.2.5.RELEASE.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-core-2.10.2.jar at spark://909dfb133bfc:46059/jars/jackson-core-2.10.2.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-jaxrs-1.9.13.jar at spark://909dfb133bfc:46059/jars/jackson-jaxrs-1.9.13.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-aop-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-aop-5.2.4.RELEASE.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.313 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/compress-lzf-1.0.3.jar at spark://909dfb133bfc:46059/jars/compress-lzf-1.0.3.jar with timestamp 1608051732313
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-kvstore_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-kvstore_2.11-2.4.5.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-mapred-1.8.2-hadoop2.jar at spark://909dfb133bfc:46059/jars/avro-mapred-1.8.2-hadoop2.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpcore-4.4.13.jar at spark://909dfb133bfc:46059/jars/httpcore-4.4.13.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-paranamer-2.10.2.jar at spark://909dfb133bfc:46059/jars/jackson-module-paranamer-2.10.2.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-crypto-1.0.0.jar at spark://909dfb133bfc:46059/jars/commons-crypto-1.0.0.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-configuration-1.6.jar at spark://909dfb133bfc:46059/jars/commons-configuration-1.6.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-annotations-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-annotations-2.6.5.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-unsafe_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-unsafe_2.11-2.4.5.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-httpclient-3.1.jar at spark://909dfb133bfc:46059/jars/commons-httpclient-3.1.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/flatbuffers-1.2.0-3f79e055.jar at spark://909dfb133bfc:46059/jars/flatbuffers-1.2.0-3f79e055.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar at spark://909dfb133bfc:46059/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-jackson-1.10.1.jar at spark://909dfb133bfc:46059/jars/parquet-jackson-1.10.1.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-xc-1.9.13.jar at spark://909dfb133bfc:46059/jars/jackson-xc-1.9.13.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.314 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming-kafka-0-10_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-streaming-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051732314
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/ivy-2.4.0.jar at spark://909dfb133bfc:46059/jars/ivy-2.4.0.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snakeyaml-1.25.jar at spark://909dfb133bfc:46059/jars/snakeyaml-1.25.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/htrace-core-3.0.4.jar at spark://909dfb133bfc:46059/jars/htrace-core-3.0.4.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-annotations-2.10.2.jar at spark://909dfb133bfc:46059/jars/jackson-annotations-2.10.2.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-server-common-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-yarn-server-common-2.6.5.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-recipes-4.0.1.jar at spark://909dfb133bfc:46059/jars/curator-recipes-4.0.1.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/api-asn1-api-1.0.0-M20.jar at spark://909dfb133bfc:46059/jars/api-asn1-api-1.0.0-M20.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/activation-1.1.1.jar at spark://909dfb133bfc:46059/jars/activation-1.1.1.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/objenesis-2.6.jar at spark://909dfb133bfc:46059/jars/objenesis-2.6.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-hadoop-1.10.1.jar at spark://909dfb133bfc:46059/jars/parquet-hadoop-1.10.1.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.315 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/log4j-1.2.17.jar at spark://909dfb133bfc:46059/jars/log4j-1.2.17.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/leveldbjni-all-1.8.jar at spark://909dfb133bfc:46059/jars/leveldbjni-all-1.8.jar with timestamp 1608051732315
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/guava-28.2-android.jar at spark://909dfb133bfc:46059/jars/guava-28.2-android.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-streaming_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-streaming_2.11-2.4.5.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-core_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-core_2.11-2.4.5.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-cli-1.2.jar at spark://909dfb133bfc:46059/jars/commons-cli-1.2.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-network-shuffle_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-network-shuffle_2.11-2.4.5.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-digester-1.8.jar at spark://909dfb133bfc:46059/jars/commons-digester-1.8.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/univocity-parsers-2.7.3.jar at spark://909dfb133bfc:46059/jars/univocity-parsers-2.7.3.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-xml_2.11-1.0.6.jar at spark://909dfb133bfc:46059/jars/scala-xml_2.11-1.0.6.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xml-apis-1.3.04.jar at spark://909dfb133bfc:46059/jars/xml-apis-1.3.04.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-jcl-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-jcl-5.2.4.RELEASE.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-jvm-4.1.3.jar at spark://909dfb133bfc:46059/jars/metrics-jvm-4.1.3.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.316 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/oro-2.0.8.jar at spark://909dfb133bfc:46059/jars/oro-2.0.8.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/failureaccess-1.0.1.jar at spark://909dfb133bfc:46059/jars/failureaccess-1.0.1.jar with timestamp 1608051732316
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sketch_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-sketch_2.11-2.4.5.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/httpclient-4.5.11.jar at spark://909dfb133bfc:46059/jars/httpclient-4.5.11.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-column-1.10.1.jar at spark://909dfb133bfc:46059/jars/parquet-column-1.10.1.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.ws.rs-api-2.1.6.jar at spark://909dfb133bfc:46059/jars/jakarta.ws.rs-api-2.1.6.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-client-2.6.0.jar at spark://909dfb133bfc:46059/jars/curator-client-2.6.0.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/minlog-1.3.0.jar at spark://909dfb133bfc:46059/jars/minlog-1.3.0.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.inject-2.6.1.jar at spark://909dfb133bfc:46059/jars/jakarta.inject-2.6.1.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jts-core-1.16.0.jar at spark://909dfb133bfc:46059/jars/jts-core-1.16.0.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.317 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/unused-1.0.0.jar at spark://909dfb133bfc:46059/jars/unused-1.0.0.jar with timestamp 1608051732317
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/snappy-java-1.1.7.3.jar at spark://909dfb133bfc:46059/jars/snappy-java-1.1.7.3.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-encoding-1.10.1.jar at spark://909dfb133bfc:46059/jars/parquet-encoding-1.10.1.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-api-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-yarn-api-2.6.5.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-shuffle-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-clients-2.3.1.jar at spark://909dfb133bfc:46059/jars/kafka-clients-2.3.1.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-1.8.2.jar at spark://909dfb133bfc:46059/jars/avro-1.8.2.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jul-to-slf4j-1.7.30.jar at spark://909dfb133bfc:46059/jars/jul-to-slf4j-1.7.30.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/gson-2.8.6.jar at spark://909dfb133bfc:46059/jars/gson-2.8.6.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-scala_2.11-2.10.2.jar at spark://909dfb133bfc:46059/jars/jackson-module-scala_2.11-2.10.2.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-net-3.1.jar at spark://909dfb133bfc:46059/jars/commons-net-3.1.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-json-2.3.1.jar at spark://909dfb133bfc:46059/jars/connect-json-2.3.1.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/protobuf-java-3.11.4.jar at spark://909dfb133bfc:46059/jars/protobuf-java-3.11.4.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.318 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-retry-1.2.5.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-retry-1.2.5.RELEASE.jar with timestamp 1608051732318
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactor-core-3.3.3.RELEASE.jar at spark://909dfb133bfc:46059/jars/reactor-core-3.3.3.RELEASE.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-3.11.2.jar at spark://909dfb133bfc:46059/jars/mongodb-driver-3.11.2.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill-java-0.9.3.jar at spark://909dfb133bfc:46059/jars/chill-java-0.9.3.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json-simple-1.1.1.jar at spark://909dfb133bfc:46059/jars/json-simple-1.1.1.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-expression-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-expression-5.2.4.RELEASE.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-async-3.11.2.jar at spark://909dfb133bfc:46059/jars/mongodb-driver-async-3.11.2.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.servlet-api-4.0.1.jar at spark://909dfb133bfc:46059/jars/javax.servlet-api-4.0.1.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-client-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-client-2.6.5.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-common-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-mapreduce-client-common-2.6.5.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-memory-0.10.0.jar at spark://909dfb133bfc:46059/jars/arrow-memory-0.10.0.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-mapper-asl-1.9.13.jar at spark://909dfb133bfc:46059/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-hdfs-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-hdfs-2.6.5.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.319 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/py4j-0.10.7.jar at spark://909dfb133bfc:46059/jars/py4j-0.10.7.jar with timestamp 1608051732319
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang3-3.9.jar at spark://909dfb133bfc:46059/jars/commons-lang3-3.9.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql-kafka-0-10_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-sql-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-messaging-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-messaging-5.2.4.RELEASE.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lz4-java-1.4.0.jar at spark://909dfb133bfc:46059/jars/lz4-java-1.4.0.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-scalap_2.11-3.5.3.jar at spark://909dfb133bfc:46059/jars/json4s-scalap_2.11-3.5.3.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-kerberos-codec-2.0.0-M15.jar at spark://909dfb133bfc:46059/jars/apacheds-kerberos-codec-2.0.0-M15.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang-2.6.jar at spark://909dfb133bfc:46059/jars/commons-lang-2.6.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/pyrolite-4.13.jar at spark://909dfb133bfc:46059/jars/pyrolite-4.13.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-catalyst_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-catalyst_2.11-2.4.5.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/aircompressor-0.10.jar at spark://909dfb133bfc:46059/jars/aircompressor-0.10.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-ast_2.11-3.5.3.jar at spark://909dfb133bfc:46059/jars/json4s-ast_2.11-3.5.3.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xercesImpl-2.9.1.jar at spark://909dfb133bfc:46059/jars/xercesImpl-2.9.1.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kryo-shaded-4.0.2.jar at spark://909dfb133bfc:46059/jars/kryo-shaded-4.0.2.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-app-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-mapreduce-client-app-2.6.5.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-json-4.1.3.jar at spark://909dfb133bfc:46059/jars/metrics-json-4.1.3.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.320 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-framework-4.0.1.jar at spark://909dfb133bfc:46059/jars/curator-framework-4.0.1.jar with timestamp 1608051732320
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-client-2.29.1.jar at spark://909dfb133bfc:46059/jars/jersey-client-2.29.1.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-2.1.jar at spark://909dfb133bfc:46059/jars/hamcrest-2.1.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-3.9.9.Final.jar at spark://909dfb133bfc:46059/jars/netty-3.9.9.Final.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-2.29.1.jar at spark://909dfb133bfc:46059/jars/jersey-container-servlet-2.29.1.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-sql_2.11-2.4.5.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-server-2.29.1.jar at spark://909dfb133bfc:46059/jars/jersey-server-2.29.1.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xbean-asm6-shaded-4.10.jar at spark://909dfb133bfc:46059/jars/xbean-asm6-shaded-4.10.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-launcher_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-launcher_2.11-2.4.5.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-beans-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-beans-5.2.4.RELEASE.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-context-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-context-5.2.4.RELEASE.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-library-2.11.12.jar at spark://909dfb133bfc:46059/jars/scala-library-2.11.12.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lombok-1.18.12.jar at spark://909dfb133bfc:46059/jars/lombok-1.18.12.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-kafka-2.3.6.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-kafka-2.3.6.RELEASE.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-java-driver-3.11.2.jar at spark://909dfb133bfc:46059/jars/mongo-java-driver-3.11.2.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-core-2.1.jar at spark://909dfb133bfc:46059/jars/hamcrest-core-2.1.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/shims-0.7.45.jar at spark://909dfb133bfc:46059/jars/shims-0.7.45.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.321 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-vector-0.10.0.jar at spark://909dfb133bfc:46059/jars/arrow-vector-0.10.0.jar with timestamp 1608051732321
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.validation-api-2.0.2.jar at spark://909dfb133bfc:46059/jars/jakarta.validation-api-2.0.2.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jsr305-1.3.9.jar at spark://909dfb133bfc:46059/jars/jsr305-1.3.9.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-core-2.29.1.jar at spark://909dfb133bfc:46059/jars/jersey-container-servlet-core-2.29.1.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-all-4.1.45.Final.jar at spark://909dfb133bfc:46059/jars/netty-all-4.1.45.Final.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-shims-1.5.5.jar at spark://909dfb133bfc:46059/jars/orc-shims-1.5.5.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/junit-4.12.jar at spark://909dfb133bfc:46059/jars/junit-4.12.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-tags_2.11-2.4.5.jar at spark://909dfb133bfc:46059/jars/spark-tags_2.11-2.4.5.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/bson-3.11.2.jar at spark://909dfb133bfc:46059/jars/bson-3.11.2.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/amqp-client-5.7.3.jar at spark://909dfb133bfc:46059/jars/amqp-client-5.7.3.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/rocksdbjni-5.18.3.jar at spark://909dfb133bfc:46059/jars/rocksdbjni-5.18.3.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-collections-3.2.2.jar at spark://909dfb133bfc:46059/jars/commons-collections-3.2.2.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-common-2.29.1.jar at spark://909dfb133bfc:46059/jars/jersey-common-2.29.1.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-codec-1.13.jar at spark://909dfb133bfc:46059/jars/commons-codec-1.13.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zstd-jni-1.3.2-2.jar at spark://909dfb133bfc:46059/jars/zstd-jni-1.3.2-2.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/janino-3.0.8.jar at spark://909dfb133bfc:46059/jars/janino-3.0.8.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-beanutils-1.7.0.jar at spark://909dfb133bfc:46059/jars/commons-beanutils-1.7.0.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-common-1.10.1.jar at spark://909dfb133bfc:46059/jars/parquet-common-1.10.1.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-core-3.11.2.jar at spark://909dfb133bfc:46059/jars/mongodb-driver-core-3.11.2.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.322 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactive-streams-1.0.3.jar at spark://909dfb133bfc:46059/jars/reactive-streams-1.0.3.jar with timestamp 1608051732322
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xz-1.5.jar at spark://909dfb133bfc:46059/jars/xz-1.5.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-jackson_2.11-3.5.3.jar at spark://909dfb133bfc:46059/jars/json4s-jackson_2.11-3.5.3.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-core-4.1.3.jar at spark://909dfb133bfc:46059/jars/metrics-core-4.1.3.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/osgi-resource-locator-1.0.3.jar at spark://909dfb133bfc:46059/jars/osgi-resource-locator-1.0.3.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-auth-2.6.5.jar at spark://909dfb133bfc:46059/jars/hadoop-auth-2.6.5.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-core_2.11-3.5.3.jar at spark://909dfb133bfc:46059/jars/json4s-core_2.11-3.5.3.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-tx-5.2.4.RELEASE.jar at spark://909dfb133bfc:46059/jars/spring-tx-5.2.4.RELEASE.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-core-1.5.5-nohive.jar at spark://909dfb133bfc:46059/jars/orc-core-1.5.5-nohive.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/error_prone_annotations-2.3.4.jar at spark://909dfb133bfc:46059/jars/error_prone_annotations-2.3.4.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/paranamer-2.8.jar at spark://909dfb133bfc:46059/jars/paranamer-2.8.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.323 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/classpath/dangerous-good-processor.original.jar at spark://909dfb133bfc:46059/jars/dangerous-good-processor.original.jar with timestamp 1608051732323
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.362 INFO 1 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint : Connecting to master spark://spark-master:7077...
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.391 INFO 1 --- [pc-connection-0] o.a.s.n.client.TransportClientFactory : Successfully created connection to spark-master/10.0.6.246:7077 after 17 ms (0 ms spent in bootstraps)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.447 INFO 1 --- [er-event-loop-3] o.a.s.s.c.StandaloneSchedulerBackend : Connected to Spark cluster with app ID app-20201215170212-0007
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.449 INFO 1 --- [er-event-loop-2] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20201215170212-0007/0 on worker-20201215165722-10.0.6.251-38117 (10.0.6.251:38117) with 2 core(s)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.450 INFO 1 --- [er-event-loop-2] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20201215170212-0007/0 on hostPort 10.0.6.251:38117 with 2 core(s), 1024.0 MB RAM
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.453 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36401.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.453 INFO 1 --- [ main] o.a.s.n.netty.NettyBlockTransferService : Server created on 909dfb133bfc:36401
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.454 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.456 INFO 1 --- [er-event-loop-1] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20201215170212-0007/0 is now RUNNING
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.469 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registering BlockManager BlockManagerId(driver, 909dfb133bfc, 36401, None)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.471 INFO 1 --- [r-event-loop-15] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 909dfb133bfc:36401 with 9.2 GB RAM, BlockManagerId(driver, 909dfb133bfc, 36401, None)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.473 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registered BlockManager BlockManagerId(driver, 909dfb133bfc, 36401, None)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.473 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Initialized BlockManager: BlockManagerId(driver, 909dfb133bfc, 36401, None)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.558 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7717b4a0{/metrics/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.568 INFO 1 --- [ main] o.a.s.s.c.StandaloneSchedulerBackend : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:12.878 WARN 1 --- [ main] org.apache.spark.SparkContext : Using an existing SparkContext; some configuration may not take effect.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.077 INFO 1 --- [ main] org.mongodb.driver.cluster : Cluster created with settings {hosts=[mongo-server:27017], mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500, requiredReplicaSetName='rs0'}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.077 INFO 1 --- [ main] org.mongodb.driver.cluster : Adding discovered server mongo-server:27017 to client view of cluster
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.120 INFO 1 --- [go-server:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:1, serverValue:26}] to mongo-server:27017
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.130 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-server:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=2538337, setName='rs0', canonicalAddress=mongo-server:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=7fffffff000000000000001d, setVersion=37028, lastWriteDate=Tue Dec 15 17:02:05 UTC 2020, lastUpdateTimeNanos=1845449658020932}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.131 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-0:27017 to client view of cluster
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.132 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-1:27017 to client view of cluster
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.133 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max election id to 7fffffff000000000000001d from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.133 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max set version to 37028 from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.133 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Discovered replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.135 INFO 1 --- [replica-0:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:2, serverValue:20}] to mongo-replica-0:27017
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.136 INFO 1 --- [replica-1:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:3, serverValue:19}] to mongo-replica-1:27017
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.141 INFO 1 --- [replica-1:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-1:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=4849991, setName='rs0', canonicalAddress=mongo-replica-1:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:02:05 UTC 2020, lastUpdateTimeNanos=1845449675741099}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.142 INFO 1 --- [replica-0:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-0:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=6240002, setName='rs0', canonicalAddress=mongo-replica-0:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:02:05 UTC 2020, lastUpdateTimeNanos=1845449676442300}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.413 INFO 1 --- [ main] org.mongodb.driver.connection : Opened connection [connectionId{localValue:4, serverValue:27}] to mongo-server:27017
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ===========================================================
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | PortalMetadata(id=5f51f7ecdf7c147d8529cc8d, portalName=P7, expiration=60, asset=com.tecnositaf.dg.processor.model.assets.Asset@b814e23)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.592 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0 stored as values in memory (estimated size 22.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.629 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.2 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.631 INFO 1 --- [r-event-loop-15] o.apache.spark.storage.BlockManagerInfo : Added broadcast_0_piece0 in memory on 909dfb133bfc:36401 (size: 2.2 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.633 INFO 1 --- [ main] org.apache.spark.SparkContext : Created broadcast 0 from broadcast at RealtimeProcessor.java:81
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.643 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.643 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Warehouse path is 'file:/spark-warehouse'.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.649 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@729d1428{/SQL,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.649 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@115924ba{/SQL/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.649 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@67afc9b6{/SQL/execution,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.650 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@36ce9eaf{/SQL/execution/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.651 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5039c2cf{/static/sql,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:13.903 INFO 1 --- [er-event-loop-6] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.6.251:58626) with ID 0
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.037 INFO 1 --- [ main] o.a.s.s.e.s.s.StateStoreCoordinatorRef : Registered StateStoreCoordinator endpoint
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.084 INFO 1 --- [er-event-loop-4] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 10.0.6.251:44511 with 366.3 MB RAM, BlockManagerId(0, 10.0.6.251, 44511, None)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.274 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding enable.auto.commit to false for executor
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.274 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding auto.offset.reset to none for executor
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.274 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding executor group.id to spark-executor-realtime-group
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.274 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding receive.buffer.bytes to 65536 see KAFKA-3135
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | starting dangerous good processor
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.486 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.ShuffledDStream@533165e7
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.486 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.MappedDStream@157ff8f8
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.486 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.TransformedDStream@5a0b925a
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.486 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@a642c54
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.488 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.488 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.488 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.488 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@a642c54
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@5a0b925a
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@a703934
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@a642c54
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@5a0b925a
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@157ff8f8
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@533165e7
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.489 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@178f5b5b
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@125ace20
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@130a99fe
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@1d642682
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@53b81a55
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@a642c54
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@5a0b925a
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@157ff8f8
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@533165e7
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.490 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@178f5b5b
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@125ace20
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@130a99fe
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@1d642682
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@6810465
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/protobuf-java-3.11.4.jar at spark://034ba3310158:41881/jars/protobuf-java-3.11.4.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-retry-1.2.5.RELEASE.jar at spark://034ba3310158:41881/jars/spring-retry-1.2.5.RELEASE.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactor-core-3.3.3.RELEASE.jar at spark://034ba3310158:41881/jars/reactor-core-3.3.3.RELEASE.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-3.11.2.jar at spark://034ba3310158:41881/jars/mongodb-driver-3.11.2.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill-java-0.9.3.jar at spark://034ba3310158:41881/jars/chill-java-0.9.3.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json-simple-1.1.1.jar at spark://034ba3310158:41881/jars/json-simple-1.1.1.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-expression-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-expression-5.2.4.RELEASE.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.776 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-async-3.11.2.jar at spark://034ba3310158:41881/jars/mongodb-driver-async-3.11.2.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.servlet-api-4.0.1.jar at spark://034ba3310158:41881/jars/javax.servlet-api-4.0.1.jar with timestamp 1608051650776
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-client-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-client-2.6.5.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-common-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-mapreduce-client-common-2.6.5.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-memory-0.10.0.jar at spark://034ba3310158:41881/jars/arrow-memory-0.10.0.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-mapper-asl-1.9.13.jar at spark://034ba3310158:41881/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-hdfs-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-hdfs-2.6.5.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/py4j-0.10.7.jar at spark://034ba3310158:41881/jars/py4j-0.10.7.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang3-3.9.jar at spark://034ba3310158:41881/jars/commons-lang3-3.9.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql-kafka-0-10_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-sql-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-messaging-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-messaging-5.2.4.RELEASE.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lz4-java-1.4.0.jar at spark://034ba3310158:41881/jars/lz4-java-1.4.0.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-scalap_2.11-3.5.3.jar at spark://034ba3310158:41881/jars/json4s-scalap_2.11-3.5.3.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.777 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-kerberos-codec-2.0.0-M15.jar at spark://034ba3310158:41881/jars/apacheds-kerberos-codec-2.0.0-M15.jar with timestamp 1608051650777
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang-2.6.jar at spark://034ba3310158:41881/jars/commons-lang-2.6.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/pyrolite-4.13.jar at spark://034ba3310158:41881/jars/pyrolite-4.13.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-catalyst_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-catalyst_2.11-2.4.5.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/aircompressor-0.10.jar at spark://034ba3310158:41881/jars/aircompressor-0.10.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-ast_2.11-3.5.3.jar at spark://034ba3310158:41881/jars/json4s-ast_2.11-3.5.3.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xercesImpl-2.9.1.jar at spark://034ba3310158:41881/jars/xercesImpl-2.9.1.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kryo-shaded-4.0.2.jar at spark://034ba3310158:41881/jars/kryo-shaded-4.0.2.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-app-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-mapreduce-client-app-2.6.5.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-json-4.1.3.jar at spark://034ba3310158:41881/jars/metrics-json-4.1.3.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-framework-4.0.1.jar at spark://034ba3310158:41881/jars/curator-framework-4.0.1.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-client-2.29.1.jar at spark://034ba3310158:41881/jars/jersey-client-2.29.1.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-2.1.jar at spark://034ba3310158:41881/jars/hamcrest-2.1.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.778 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-3.9.9.Final.jar at spark://034ba3310158:41881/jars/netty-3.9.9.Final.jar with timestamp 1608051650778
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-2.29.1.jar at spark://034ba3310158:41881/jars/jersey-container-servlet-2.29.1.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-sql_2.11-2.4.5.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-server-2.29.1.jar at spark://034ba3310158:41881/jars/jersey-server-2.29.1.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xbean-asm6-shaded-4.10.jar at spark://034ba3310158:41881/jars/xbean-asm6-shaded-4.10.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-launcher_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-launcher_2.11-2.4.5.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-beans-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-beans-5.2.4.RELEASE.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-context-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-context-5.2.4.RELEASE.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-library-2.11.12.jar at spark://034ba3310158:41881/jars/scala-library-2.11.12.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lombok-1.18.12.jar at spark://034ba3310158:41881/jars/lombok-1.18.12.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-kafka-2.3.6.RELEASE.jar at spark://034ba3310158:41881/jars/spring-kafka-2.3.6.RELEASE.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-java-driver-3.11.2.jar at spark://034ba3310158:41881/jars/mongo-java-driver-3.11.2.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.779 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-core-2.1.jar at spark://034ba3310158:41881/jars/hamcrest-core-2.1.jar with timestamp 1608051650779
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/shims-0.7.45.jar at spark://034ba3310158:41881/jars/shims-0.7.45.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-vector-0.10.0.jar at spark://034ba3310158:41881/jars/arrow-vector-0.10.0.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.validation-api-2.0.2.jar at spark://034ba3310158:41881/jars/jakarta.validation-api-2.0.2.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jsr305-1.3.9.jar at spark://034ba3310158:41881/jars/jsr305-1.3.9.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-core-2.29.1.jar at spark://034ba3310158:41881/jars/jersey-container-servlet-core-2.29.1.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-all-4.1.45.Final.jar at spark://034ba3310158:41881/jars/netty-all-4.1.45.Final.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-shims-1.5.5.jar at spark://034ba3310158:41881/jars/orc-shims-1.5.5.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/junit-4.12.jar at spark://034ba3310158:41881/jars/junit-4.12.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-tags_2.11-2.4.5.jar at spark://034ba3310158:41881/jars/spark-tags_2.11-2.4.5.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/bson-3.11.2.jar at spark://034ba3310158:41881/jars/bson-3.11.2.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/amqp-client-5.7.3.jar at spark://034ba3310158:41881/jars/amqp-client-5.7.3.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.780 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/rocksdbjni-5.18.3.jar at spark://034ba3310158:41881/jars/rocksdbjni-5.18.3.jar with timestamp 1608051650780
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-collections-3.2.2.jar at spark://034ba3310158:41881/jars/commons-collections-3.2.2.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-common-2.29.1.jar at spark://034ba3310158:41881/jars/jersey-common-2.29.1.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-codec-1.13.jar at spark://034ba3310158:41881/jars/commons-codec-1.13.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zstd-jni-1.3.2-2.jar at spark://034ba3310158:41881/jars/zstd-jni-1.3.2-2.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/janino-3.0.8.jar at spark://034ba3310158:41881/jars/janino-3.0.8.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-beanutils-1.7.0.jar at spark://034ba3310158:41881/jars/commons-beanutils-1.7.0.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-common-1.10.1.jar at spark://034ba3310158:41881/jars/parquet-common-1.10.1.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-core-3.11.2.jar at spark://034ba3310158:41881/jars/mongodb-driver-core-3.11.2.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactive-streams-1.0.3.jar at spark://034ba3310158:41881/jars/reactive-streams-1.0.3.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xz-1.5.jar at spark://034ba3310158:41881/jars/xz-1.5.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.781 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-jackson_2.11-3.5.3.jar at spark://034ba3310158:41881/jars/json4s-jackson_2.11-3.5.3.jar with timestamp 1608051650781
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-core-4.1.3.jar at spark://034ba3310158:41881/jars/metrics-core-4.1.3.jar with timestamp 1608051650782
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/osgi-resource-locator-1.0.3.jar at spark://034ba3310158:41881/jars/osgi-resource-locator-1.0.3.jar with timestamp 1608051650782
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-auth-2.6.5.jar at spark://034ba3310158:41881/jars/hadoop-auth-2.6.5.jar with timestamp 1608051650782
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-core_2.11-3.5.3.jar at spark://034ba3310158:41881/jars/json4s-core_2.11-3.5.3.jar with timestamp 1608051650782
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.782 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-tx-5.2.4.RELEASE.jar at spark://034ba3310158:41881/jars/spring-tx-5.2.4.RELEASE.jar with timestamp 1608051650782
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-core-1.5.5-nohive.jar at spark://034ba3310158:41881/jars/orc-core-1.5.5-nohive.jar with timestamp 1608051650784
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/error_prone_annotations-2.3.4.jar at spark://034ba3310158:41881/jars/error_prone_annotations-2.3.4.jar with timestamp 1608051650784
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/paranamer-2.8.jar at spark://034ba3310158:41881/jars/paranamer-2.8.jar with timestamp 1608051650784
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.784 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/classpath/dangerous-good-processor.original.jar at spark://034ba3310158:41881/jars/dangerous-good-processor.original.jar with timestamp 1608051650784
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.855 INFO 1 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint : Connecting to master spark://spark-master:7077...
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:50.912 INFO 1 --- [pc-connection-0] o.a.s.n.client.TransportClientFactory : Successfully created connection to spark-master/10.0.6.246:7077 after 33 ms (0 ms spent in bootstraps)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.037 INFO 1 --- [er-event-loop-2] o.a.s.s.c.StandaloneSchedulerBackend : Connected to Spark cluster with app ID app-20201215170051-0005
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.039 INFO 1 --- [r-event-loop-13] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20201215170051-0005/0 on worker-20201215165722-10.0.6.251-38117 (10.0.6.251:38117) with 2 core(s)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.040 INFO 1 --- [r-event-loop-13] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20201215170051-0005/0 on hostPort 10.0.6.251:38117 with 2 core(s), 1024.0 MB RAM
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.061 INFO 1 --- [r-event-loop-10] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20201215170051-0005/0 is now RUNNING
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.069 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 32843.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.070 INFO 1 --- [ main] o.a.s.n.netty.NettyBlockTransferService : Server created on 034ba3310158:32843
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.070 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.085 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registering BlockManager BlockManagerId(driver, 034ba3310158, 32843, None)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.089 INFO 1 --- [r-event-loop-14] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 034ba3310158:32843 with 9.2 GB RAM, BlockManagerId(driver, 034ba3310158, 32843, None)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.091 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registered BlockManager BlockManagerId(driver, 034ba3310158, 32843, None)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.091 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Initialized BlockManager: BlockManagerId(driver, 034ba3310158, 32843, None)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.218 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@794cb26b{/metrics/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.244 INFO 1 --- [ main] o.a.s.s.c.StandaloneSchedulerBackend : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:51.708 WARN 1 --- [ main] org.apache.spark.SparkContext : Using an existing SparkContext; some configuration may not take effect.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.011 INFO 1 --- [ main] org.mongodb.driver.cluster : Cluster created with settings {hosts=[mongo-server:27017], mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500, requiredReplicaSetName='rs0'}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.011 INFO 1 --- [ main] org.mongodb.driver.cluster : Adding discovered server mongo-server:27017 to client view of cluster
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.087 INFO 1 --- [go-server:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:1, serverValue:22}] to mongo-server:27017
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.100 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-server:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=2685616, setName='rs0', canonicalAddress=mongo-server:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=7fffffff000000000000001d, setVersion=37028, lastWriteDate=Tue Dec 15 17:00:45 UTC 2020, lastUpdateTimeNanos=1845368625031453}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.110 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-0:27017 to client view of cluster
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.112 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-1:27017 to client view of cluster
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.115 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max election id to 7fffffff000000000000001d from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.115 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max set version to 37028 from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.115 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Discovered replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.118 INFO 1 --- [replica-0:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:2, serverValue:18}] to mongo-replica-0:27017
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.127 INFO 1 --- [replica-1:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:3, serverValue:17}] to mongo-replica-1:27017
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.131 INFO 1 --- [replica-1:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-1:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=2695943, setName='rs0', canonicalAddress=mongo-replica-1:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:00:45 UTC 2020, lastUpdateTimeNanos=1845368665308550}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.139 INFO 1 --- [replica-0:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-0:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=20222029, setName='rs0', canonicalAddress=mongo-replica-0:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:00:45 UTC 2020, lastUpdateTimeNanos=1845368673685192}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.548 INFO 1 --- [ main] org.mongodb.driver.connection : Opened connection [connectionId{localValue:4, serverValue:23}] to mongo-server:27017
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ===========================================================
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | PortalMetadata(id=5f51f7ecdf7c147d8529cc8d, portalName=P7, expiration=60, asset=com.tecnositaf.dg.processor.model.assets.Asset@2d3768ce)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.825 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0 stored as values in memory (estimated size 22.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.873 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.2 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.875 INFO 1 --- [er-event-loop-4] o.apache.spark.storage.BlockManagerInfo : Added broadcast_0_piece0 in memory on 034ba3310158:32843 (size: 2.2 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.878 INFO 1 --- [ main] org.apache.spark.SparkContext : Created broadcast 0 from broadcast at RealtimeProcessor.java:81
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.894 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.894 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Warehouse path is 'file:/spark-warehouse'.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.909 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@467421cc{/SQL,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.910 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@eb09112{/SQL/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.910 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@4629dde5{/SQL/execution,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.911 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7e35d743{/SQL/execution/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:52.912 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2267b0bb{/static/sql,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.118 INFO 1 --- [er-event-loop-0] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.6.251:60404) with ID 0
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.227 INFO 1 --- [ main] o.a.s.s.e.s.s.StateStoreCoordinatorRef : Registered StateStoreCoordinator endpoint
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.228 INFO 1 --- [er-event-loop-4] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 10.0.6.251:35309 with 366.3 MB RAM, BlockManagerId(0, 10.0.6.251, 35309, None)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.389 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding enable.auto.commit to false for executor
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.390 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding auto.offset.reset to none for executor
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.390 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding executor group.id to spark-executor-realtime-group
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.390 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding receive.buffer.bytes to 65536 see KAFKA-3135
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | starting dangerous good processor
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.574 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.ShuffledDStream@321bf4b1
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.574 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.MappedDStream@5bbf8daa
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.574 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.TransformedDStream@326489c4
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.574 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@67e12e28
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.576 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.577 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.577 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.577 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@67e12e28
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@326489c4
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@34714012
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@67e12e28
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@326489c4
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5bbf8daa
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.578 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@321bf4b1
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@13231e7b
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@5eaa6f76
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@3d41449a
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@79bb14d8
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.579 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@54af9cce
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@67e12e28
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.validation-api-2.0.2.jar at spark://5c1f7f9d3fca:44051/jars/jakarta.validation-api-2.0.2.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jsr305-1.3.9.jar at spark://5c1f7f9d3fca:44051/jars/jsr305-1.3.9.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-core-2.29.1.jar at spark://5c1f7f9d3fca:44051/jars/jersey-container-servlet-core-2.29.1.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-all-4.1.45.Final.jar at spark://5c1f7f9d3fca:44051/jars/netty-all-4.1.45.Final.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.288 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-shims-1.5.5.jar at spark://5c1f7f9d3fca:44051/jars/orc-shims-1.5.5.jar with timestamp 1608051693288
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/junit-4.12.jar at spark://5c1f7f9d3fca:44051/jars/junit-4.12.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-tags_2.11-2.4.5.jar at spark://5c1f7f9d3fca:44051/jars/spark-tags_2.11-2.4.5.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/bson-3.11.2.jar at spark://5c1f7f9d3fca:44051/jars/bson-3.11.2.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/amqp-client-5.7.3.jar at spark://5c1f7f9d3fca:44051/jars/amqp-client-5.7.3.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/rocksdbjni-5.18.3.jar at spark://5c1f7f9d3fca:44051/jars/rocksdbjni-5.18.3.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-collections-3.2.2.jar at spark://5c1f7f9d3fca:44051/jars/commons-collections-3.2.2.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-common-2.29.1.jar at spark://5c1f7f9d3fca:44051/jars/jersey-common-2.29.1.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-encoding-1.10.1.jar at spark://c899ec0ab125:34063/jars/parquet-encoding-1.10.1.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-yarn-api-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-yarn-api-2.6.5.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-shuffle-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-mapreduce-client-shuffle-2.6.5.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kafka-clients-2.3.1.jar at spark://c899ec0ab125:34063/jars/kafka-clients-2.3.1.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/avro-1.8.2.jar at spark://c899ec0ab125:34063/jars/avro-1.8.2.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jul-to-slf4j-1.7.30.jar at spark://c899ec0ab125:34063/jars/jul-to-slf4j-1.7.30.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/gson-2.8.6.jar at spark://c899ec0ab125:34063/jars/gson-2.8.6.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-module-scala_2.11-2.10.2.jar at spark://c899ec0ab125:34063/jars/jackson-module-scala_2.11-2.10.2.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.425 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-net-3.1.jar at spark://c899ec0ab125:34063/jars/commons-net-3.1.jar with timestamp 1608051818425
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/connect-json-2.3.1.jar at spark://c899ec0ab125:34063/jars/connect-json-2.3.1.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/protobuf-java-3.11.4.jar at spark://c899ec0ab125:34063/jars/protobuf-java-3.11.4.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-retry-1.2.5.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-retry-1.2.5.RELEASE.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactor-core-3.3.3.RELEASE.jar at spark://c899ec0ab125:34063/jars/reactor-core-3.3.3.RELEASE.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-3.11.2.jar at spark://c899ec0ab125:34063/jars/mongodb-driver-3.11.2.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/chill-java-0.9.3.jar at spark://c899ec0ab125:34063/jars/chill-java-0.9.3.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json-simple-1.1.1.jar at spark://c899ec0ab125:34063/jars/json-simple-1.1.1.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-expression-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-expression-5.2.4.RELEASE.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-async-3.11.2.jar at spark://c899ec0ab125:34063/jars/mongodb-driver-async-3.11.2.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.426 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/javax.servlet-api-4.0.1.jar at spark://c899ec0ab125:34063/jars/javax.servlet-api-4.0.1.jar with timestamp 1608051818426
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-client-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-client-2.6.5.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-common-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-mapreduce-client-common-2.6.5.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-memory-0.10.0.jar at spark://c899ec0ab125:34063/jars/arrow-memory-0.10.0.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jackson-mapper-asl-1.9.13.jar at spark://c899ec0ab125:34063/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-hdfs-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-hdfs-2.6.5.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/py4j-0.10.7.jar at spark://c899ec0ab125:34063/jars/py4j-0.10.7.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang3-3.9.jar at spark://c899ec0ab125:34063/jars/commons-lang3-3.9.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql-kafka-0-10_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-sql-kafka-0-10_2.11-2.4.5.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-messaging-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-messaging-5.2.4.RELEASE.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lz4-java-1.4.0.jar at spark://c899ec0ab125:34063/jars/lz4-java-1.4.0.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.427 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-scalap_2.11-3.5.3.jar at spark://c899ec0ab125:34063/jars/json4s-scalap_2.11-3.5.3.jar with timestamp 1608051818427
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/apacheds-kerberos-codec-2.0.0-M15.jar at spark://c899ec0ab125:34063/jars/apacheds-kerberos-codec-2.0.0-M15.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-lang-2.6.jar at spark://c899ec0ab125:34063/jars/commons-lang-2.6.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/pyrolite-4.13.jar at spark://c899ec0ab125:34063/jars/pyrolite-4.13.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-catalyst_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-catalyst_2.11-2.4.5.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/aircompressor-0.10.jar at spark://c899ec0ab125:34063/jars/aircompressor-0.10.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-ast_2.11-3.5.3.jar at spark://c899ec0ab125:34063/jars/json4s-ast_2.11-3.5.3.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xercesImpl-2.9.1.jar at spark://c899ec0ab125:34063/jars/xercesImpl-2.9.1.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/kryo-shaded-4.0.2.jar at spark://c899ec0ab125:34063/jars/kryo-shaded-4.0.2.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-mapreduce-client-app-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-mapreduce-client-app-2.6.5.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.428 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-json-4.1.3.jar at spark://c899ec0ab125:34063/jars/metrics-json-4.1.3.jar with timestamp 1608051818428
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/curator-framework-4.0.1.jar at spark://c899ec0ab125:34063/jars/curator-framework-4.0.1.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-client-2.29.1.jar at spark://c899ec0ab125:34063/jars/jersey-client-2.29.1.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-2.1.jar at spark://c899ec0ab125:34063/jars/hamcrest-2.1.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-3.9.9.Final.jar at spark://c899ec0ab125:34063/jars/netty-3.9.9.Final.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-2.29.1.jar at spark://c899ec0ab125:34063/jars/jersey-container-servlet-2.29.1.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-sql_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-sql_2.11-2.4.5.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-server-2.29.1.jar at spark://c899ec0ab125:34063/jars/jersey-server-2.29.1.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xbean-asm6-shaded-4.10.jar at spark://c899ec0ab125:34063/jars/xbean-asm6-shaded-4.10.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-launcher_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-launcher_2.11-2.4.5.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-beans-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-beans-5.2.4.RELEASE.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-context-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-context-5.2.4.RELEASE.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.429 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/scala-library-2.11.12.jar at spark://c899ec0ab125:34063/jars/scala-library-2.11.12.jar with timestamp 1608051818429
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/lombok-1.18.12.jar at spark://c899ec0ab125:34063/jars/lombok-1.18.12.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-kafka-2.3.6.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-kafka-2.3.6.RELEASE.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongo-java-driver-3.11.2.jar at spark://c899ec0ab125:34063/jars/mongo-java-driver-3.11.2.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hamcrest-core-2.1.jar at spark://c899ec0ab125:34063/jars/hamcrest-core-2.1.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/shims-0.7.45.jar at spark://c899ec0ab125:34063/jars/shims-0.7.45.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/arrow-vector-0.10.0.jar at spark://c899ec0ab125:34063/jars/arrow-vector-0.10.0.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jakarta.validation-api-2.0.2.jar at spark://c899ec0ab125:34063/jars/jakarta.validation-api-2.0.2.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jsr305-1.3.9.jar at spark://c899ec0ab125:34063/jars/jsr305-1.3.9.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-container-servlet-core-2.29.1.jar at spark://c899ec0ab125:34063/jars/jersey-container-servlet-core-2.29.1.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.430 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/netty-all-4.1.45.Final.jar at spark://c899ec0ab125:34063/jars/netty-all-4.1.45.Final.jar with timestamp 1608051818430
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-shims-1.5.5.jar at spark://c899ec0ab125:34063/jars/orc-shims-1.5.5.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/junit-4.12.jar at spark://c899ec0ab125:34063/jars/junit-4.12.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spark-tags_2.11-2.4.5.jar at spark://c899ec0ab125:34063/jars/spark-tags_2.11-2.4.5.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/bson-3.11.2.jar at spark://c899ec0ab125:34063/jars/bson-3.11.2.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/amqp-client-5.7.3.jar at spark://c899ec0ab125:34063/jars/amqp-client-5.7.3.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/rocksdbjni-5.18.3.jar at spark://c899ec0ab125:34063/jars/rocksdbjni-5.18.3.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-collections-3.2.2.jar at spark://c899ec0ab125:34063/jars/commons-collections-3.2.2.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/jersey-common-2.29.1.jar at spark://c899ec0ab125:34063/jars/jersey-common-2.29.1.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-codec-1.13.jar at spark://c899ec0ab125:34063/jars/commons-codec-1.13.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zstd-jni-1.3.2-2.jar at spark://c899ec0ab125:34063/jars/zstd-jni-1.3.2-2.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/janino-3.0.8.jar at spark://c899ec0ab125:34063/jars/janino-3.0.8.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.431 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-beanutils-1.7.0.jar at spark://c899ec0ab125:34063/jars/commons-beanutils-1.7.0.jar with timestamp 1608051818431
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-common-1.10.1.jar at spark://c899ec0ab125:34063/jars/parquet-common-1.10.1.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-core-3.11.2.jar at spark://c899ec0ab125:34063/jars/mongodb-driver-core-3.11.2.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactive-streams-1.0.3.jar at spark://c899ec0ab125:34063/jars/reactive-streams-1.0.3.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xz-1.5.jar at spark://c899ec0ab125:34063/jars/xz-1.5.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-jackson_2.11-3.5.3.jar at spark://c899ec0ab125:34063/jars/json4s-jackson_2.11-3.5.3.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-core-4.1.3.jar at spark://c899ec0ab125:34063/jars/metrics-core-4.1.3.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/osgi-resource-locator-1.0.3.jar at spark://c899ec0ab125:34063/jars/osgi-resource-locator-1.0.3.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-auth-2.6.5.jar at spark://c899ec0ab125:34063/jars/hadoop-auth-2.6.5.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.432 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-core_2.11-3.5.3.jar at spark://c899ec0ab125:34063/jars/json4s-core_2.11-3.5.3.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.433 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-tx-5.2.4.RELEASE.jar at spark://c899ec0ab125:34063/jars/spring-tx-5.2.4.RELEASE.jar with timestamp 1608051818432
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.433 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-core-1.5.5-nohive.jar at spark://c899ec0ab125:34063/jars/orc-core-1.5.5-nohive.jar with timestamp 1608051818433
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.433 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/error_prone_annotations-2.3.4.jar at spark://c899ec0ab125:34063/jars/error_prone_annotations-2.3.4.jar with timestamp 1608051818433
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.433 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/paranamer-2.8.jar at spark://c899ec0ab125:34063/jars/paranamer-2.8.jar with timestamp 1608051818433
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.433 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/classpath/dangerous-good-processor.original.jar at spark://c899ec0ab125:34063/jars/dangerous-good-processor.original.jar with timestamp 1608051818433
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.597 INFO 1 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint : Connecting to master spark://spark-master:7077...
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.756 INFO 1 --- [pc-connection-0] o.a.s.n.client.TransportClientFactory : Successfully created connection to spark-master/10.0.6.246:7077 after 94 ms (0 ms spent in bootstraps)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.937 INFO 1 --- [er-event-loop-3] o.a.s.s.c.StandaloneSchedulerBackend : Connected to Spark cluster with app ID app-20201215170338-0009
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.945 INFO 1 --- [er-event-loop-7] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20201215170338-0009/0 on worker-20201215165722-10.0.6.251-38117 (10.0.6.251:38117) with 2 core(s)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.946 INFO 1 --- [er-event-loop-7] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20201215170338-0009/0 on hostPort 10.0.6.251:38117 with 2 core(s), 1024.0 MB RAM
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.982 INFO 1 --- [r-event-loop-14] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20201215170338-0009/0 is now RUNNING
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.982 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36271.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.983 INFO 1 --- [ main] o.a.s.n.netty.NettyBlockTransferService : Server created on c899ec0ab125:36271
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:38.984 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:39.001 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registering BlockManager BlockManagerId(driver, c899ec0ab125, 36271, None)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:39.004 INFO 1 --- [r-event-loop-15] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager c899ec0ab125:36271 with 9.2 GB RAM, BlockManagerId(driver, c899ec0ab125, 36271, None)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:39.015 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registered BlockManager BlockManagerId(driver, c899ec0ab125, 36271, None)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:39.016 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Initialized BlockManager: BlockManagerId(driver, c899ec0ab125, 36271, None)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:39.186 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@10166230{/metrics/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:39.215 INFO 1 --- [ main] o.a.s.s.c.StandaloneSchedulerBackend : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:39.806 WARN 1 --- [ main] org.apache.spark.SparkContext : Using an existing SparkContext; some configuration may not take effect.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.256 INFO 1 --- [ main] org.mongodb.driver.cluster : Cluster created with settings {hosts=[mongo-server:27017], mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500, requiredReplicaSetName='rs0'}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.256 INFO 1 --- [ main] org.mongodb.driver.cluster : Adding discovered server mongo-server:27017 to client view of cluster
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.356 INFO 1 --- [go-server:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:1, serverValue:39}] to mongo-server:27017
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.378 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-server:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=7690689, setName='rs0', canonicalAddress=mongo-server:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=7fffffff000000000000001d, setVersion=37028, lastWriteDate=Tue Dec 15 17:03:35 UTC 2020, lastUpdateTimeNanos=1845536899957582}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.379 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-0:27017 to client view of cluster
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.388 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-1:27017 to client view of cluster
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.476 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max election id to 7fffffff000000000000001d from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.476 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max set version to 37028 from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.476 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Discovered replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.502 INFO 1 --- [replica-0:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:3, serverValue:35}] to mongo-replica-0:27017
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.512 INFO 1 --- [replica-1:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:2, serverValue:37}] to mongo-replica-1:27017
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.515 INFO 1 --- [replica-1:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-1:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=2158434, setName='rs0', canonicalAddress=mongo-replica-1:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:03:35 UTC 2020, lastUpdateTimeNanos=1845537049719159}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:40.525 INFO 1 --- [replica-0:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-0:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=22170256, setName='rs0', canonicalAddress=mongo-replica-0:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:03:35 UTC 2020, lastUpdateTimeNanos=1845537059727880}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:41.164 INFO 1 --- [ main] org.mongodb.driver.connection : Opened connection [connectionId{localValue:4, serverValue:40}] to mongo-server:27017
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:41.945 INFO 1 --- [r-event-loop-12] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.6.251:54920) with ID 0
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ===========================================================
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | PortalMetadata(id=5f51f7ecdf7c147d8529cc8d, portalName=P7, expiration=60, asset=com.tecnositaf.dg.processor.model.assets.Asset@73905dff)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.133 INFO 1 --- [er-event-loop-2] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 10.0.6.251:44125 with 366.3 MB RAM, BlockManagerId(0, 10.0.6.251, 44125, None)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.250 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0 stored as values in memory (estimated size 22.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.293 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.2 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.295 INFO 1 --- [er-event-loop-5] o.apache.spark.storage.BlockManagerInfo : Added broadcast_0_piece0 in memory on c899ec0ab125:36271 (size: 2.2 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.299 INFO 1 --- [ main] org.apache.spark.SparkContext : Created broadcast 0 from broadcast at RealtimeProcessor.java:81
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.318 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.318 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Warehouse path is 'file:/spark-warehouse'.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.324 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@13edc1bc{/SQL,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.325 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@77604a86{/SQL/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.326 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@32b8992e{/SQL/execution,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.326 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3ef7fb04{/SQL/execution/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.327 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@1e37cb10{/static/sql,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.652 INFO 1 --- [ main] o.a.s.s.e.s.s.StateStoreCoordinatorRef : Registered StateStoreCoordinator endpoint
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.848 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding enable.auto.commit to false for executor
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.848 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding auto.offset.reset to none for executor
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.849 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding executor group.id to spark-executor-realtime-group
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:42.849 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding receive.buffer.bytes to 65536 see KAFKA-3135
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | starting dangerous good processor
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.051 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.ShuffledDStream@7594d189
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.051 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.MappedDStream@25865fca
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.051 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.TransformedDStream@20f94e9a
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.051 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@11b234aa
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.053 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@11b234aa
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.054 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@20f94e9a
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@184e5c44
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@11b234aa
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@20f94e9a
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@25865fca
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.055 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.057 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@7594d189
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.057 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@11e36e5c
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@236bb278
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@38a3c078
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@22781286
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@a642c54
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.491 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@5a0b925a
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@157ff8f8
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@533165e7
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@178f5b5b
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.492 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@125ace20
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@130a99fe
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@1d642682
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@455591ad
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.493 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@15a87fbb
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@a642c54
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.494 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@6ec9f664
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.551 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | allow.auto.create.topics = true
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | auto.commit.interval.ms = 5000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | auto.offset.reset = latest
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | bootstrap.servers = [kafka-broker:9092]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | check.crcs = true
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | client.dns.lookup = default
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | client.id =
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | client.rack =
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | connections.max.idle.ms = 540000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | default.api.timeout.ms = 60000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | enable.auto.commit = false
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | exclude.internal.topics = true
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | fetch.max.bytes = 52428800
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | fetch.max.wait.ms = 500
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | fetch.min.bytes = 1
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | group.id = realtime-group
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | group.instance.id = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | heartbeat.interval.ms = 3000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | interceptor.classes = []
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | internal.leave.group.on.close = true
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | isolation.level = read_uncommitted
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | max.partition.fetch.bytes = 1048576
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | max.poll.interval.ms = 300000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | max.poll.records = 500
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | metadata.max.age.ms = 300000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | metric.reporters = []
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | metrics.num.samples = 2
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | metrics.recording.level = INFO
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | metrics.sample.window.ms = 30000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | receive.buffer.bytes = 65536
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | reconnect.backoff.max.ms = 1000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | reconnect.backoff.ms = 50
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | request.timeout.ms = 30000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | retry.backoff.ms = 100
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.client.callback.handler.class = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.jaas.config = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.kerberos.kinit.cmd = /usr/bin/kinit
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.kerberos.min.time.before.relogin = 60000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.kerberos.service.name = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.kerberos.ticket.renew.jitter = 0.05
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.kerberos.ticket.renew.window.factor = 0.8
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.login.callback.handler.class = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.login.class = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.login.refresh.buffer.seconds = 300
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.login.refresh.min.period.seconds = 60
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.login.refresh.window.factor = 0.8
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.login.refresh.window.jitter = 0.05
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | sasl.mechanism = GSSAPI
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | security.protocol = PLAINTEXT
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | send.buffer.bytes = 131072
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | session.timeout.ms = 10000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.cipher.suites = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.endpoint.identification.algorithm = https
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.key.password = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.keymanager.algorithm = SunX509
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.keystore.location = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.keystore.password = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.keystore.type = JKS
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.protocol = TLS
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.provider = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.secure.random.implementation = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.trustmanager.algorithm = PKIX
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.truststore.location = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.truststore.password = null
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | ssl.truststore.type = JKS
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | value.deserializer = class com.tecnositaf.dg.processor.model.serde.IncomingDataDeserializer
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.606 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.3.1
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.606 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 18a913733fb71c01
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.606 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1608051734605
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.608 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-1, groupId=realtime-group] Subscribed to topic(s): datagenerator-producer-topic
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.735 INFO 1 --- [ool-2-worker-29] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-1, groupId=realtime-group] Cluster ID: A6TeuCvYQcyPFIxM9OVJ0g
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.736 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Discovered group coordinator kafka-broker:9092 (id: 2147483646 rack: null)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.738 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Revoking previously assigned partitions []
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.738 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:14.744 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.751 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Successfully joined group with generation 15
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.753 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Setting newly assigned partitions: datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.764 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Found no committed offset for partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.775 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.780 INFO 1 --- [streaming-start] o.a.spark.streaming.util.RecurringTimer : Started timer for JobGenerator at time 1608051736000
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.781 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobGenerator : Started JobGenerator at 1608051736000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.781 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobScheduler : Started JobScheduler
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.783 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2eef43f5{/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.784 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2c10f9f3{/streaming/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.785 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7a925ac4{/streaming/batch,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.785 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@3b09582d{/streaming/batch/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.786 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@27a8bfa1{/static/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.786 INFO 1 --- [ main] o.a.spark.streaming.StreamingContext : StreamingContext started
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.810 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.817 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.862 INFO 1 --- [ JobGenerator] o.a.s.s.d.InternalMapWithStateDStream : Time 1608051734000 ms is invalid as zeroTime is 1608051734000 ms , slideDuration is 2000 ms and difference is 0 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.901 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051736000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.902 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051736000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.902 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051736000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.929 INFO 1 --- [ JobScheduler] o.a.s.streaming.scheduler.JobScheduler : Starting job streaming job 1608051736000 ms.0 from job set of time 1608051736000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.932 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051736000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.946 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051736000 ms to writer queue
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:17.949 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051736000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051736000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.000 INFO 1 --- [-job-executor-0] org.apache.spark.SparkContext : Starting job: isEmpty at RealtimeProcessor.java:302
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.002 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.008 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.016 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Got job 0 (isEmpty at RealtimeProcessor.java:302) with 1 output partitions
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.017 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Final stage: ResultStage 0 (isEmpty at RealtimeProcessor.java:302)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.017 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Parents of final stage: List()
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.018 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Missing parents: List()
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.025 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051738000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.025 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051738000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.027 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051738000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.029 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205), which has no missing parents
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.029 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051738000 ms
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.031 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051738000 ms to writer queue
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.096 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1 stored as values in memory (estimated size 15.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.097 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1_piece0 stored as bytes in memory (estimated size 6.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.098 INFO 1 --- [r-event-loop-14] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on 909dfb133bfc:36401 (size: 6.7 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.098 INFO 1 --- [uler-event-loop] org.apache.spark.SparkContext : Created broadcast 1 from broadcast at DAGScheduler.scala:1163
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.104 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.104 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062763_321939
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.106 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205) (first 15 tasks are for partitions Vector(0))
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.106 INFO 1 --- [uler-event-loop] o.a.spark.scheduler.TaskSchedulerImpl : Adding task set 0.0 with 1 tasks
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.108 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.116 WARN 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.117 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051736000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.117 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051736000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051736000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.122 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.122 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062764_321940
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.122 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.124 WARN 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.124 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 2 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051736000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.124 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051736000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051736000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.124 INFO 1 --- [er-event-loop-3] o.apache.spark.scheduler.TaskSetManager : Starting task 0.0 in stage 0.0 (TID 0, 10.0.6.251, executor 0, partition 0, PROCESS_LOCAL, 7771 bytes)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.129 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.129 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062765_321941
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.130 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.132 WARN 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.132 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 3 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051736000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.133 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Could not write checkpoint for time 1608051736000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051736000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.133 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051738000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051738000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.137 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.137 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062766_321942
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.138 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.139 WARN 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.140 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051738000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@326489c4
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.580 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5bbf8daa
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.581 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@321bf4b1
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@13231e7b
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@5eaa6f76
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@3d41449a
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@79bb14d8
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.583 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.584 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.584 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@70799ca7
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@58d635de
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@67e12e28
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.587 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@326489c4
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5bbf8daa
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@321bf4b1
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@13231e7b
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@5eaa6f76
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@3d41449a
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@79bb14d8
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@21bf1b1f
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@17a756db
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@67e12e28
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.588 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@7e474bd
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.638 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | allow.auto.create.topics = true
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | auto.commit.interval.ms = 5000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | auto.offset.reset = latest
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | bootstrap.servers = [kafka-broker:9092]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | check.crcs = true
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | client.dns.lookup = default
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | client.id =
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | client.rack =
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | connections.max.idle.ms = 540000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | default.api.timeout.ms = 60000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | enable.auto.commit = false
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | exclude.internal.topics = true
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | fetch.max.bytes = 52428800
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | fetch.max.wait.ms = 500
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | fetch.min.bytes = 1
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | group.id = realtime-group
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | group.instance.id = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | heartbeat.interval.ms = 3000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | interceptor.classes = []
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | internal.leave.group.on.close = true
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | isolation.level = read_uncommitted
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | max.partition.fetch.bytes = 1048576
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | max.poll.interval.ms = 300000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | max.poll.records = 500
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | metadata.max.age.ms = 300000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | metric.reporters = []
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | metrics.num.samples = 2
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | metrics.recording.level = INFO
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | metrics.sample.window.ms = 30000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | receive.buffer.bytes = 65536
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | reconnect.backoff.max.ms = 1000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | reconnect.backoff.ms = 50
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | request.timeout.ms = 30000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | retry.backoff.ms = 100
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.client.callback.handler.class = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.jaas.config = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.kerberos.kinit.cmd = /usr/bin/kinit
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.kerberos.min.time.before.relogin = 60000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.kerberos.service.name = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.kerberos.ticket.renew.jitter = 0.05
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.kerberos.ticket.renew.window.factor = 0.8
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.login.callback.handler.class = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.login.class = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.login.refresh.buffer.seconds = 300
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.login.refresh.min.period.seconds = 60
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.login.refresh.window.factor = 0.8
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.login.refresh.window.jitter = 0.05
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | sasl.mechanism = GSSAPI
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | security.protocol = PLAINTEXT
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | send.buffer.bytes = 131072
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | session.timeout.ms = 10000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.cipher.suites = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.endpoint.identification.algorithm = https
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.key.password = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.keymanager.algorithm = SunX509
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.keystore.location = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.keystore.password = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.keystore.type = JKS
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.protocol = TLS
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.provider = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.secure.random.implementation = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.trustmanager.algorithm = PKIX
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.truststore.location = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.truststore.password = null
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | ssl.truststore.type = JKS
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | value.deserializer = class com.tecnositaf.dg.processor.model.serde.IncomingDataDeserializer
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.690 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.3.1
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.691 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 18a913733fb71c01
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.691 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1608051653690
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.692 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-1, groupId=realtime-group] Subscribed to topic(s): datagenerator-producer-topic
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.807 INFO 1 --- [ool-2-worker-29] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-1, groupId=realtime-group] Cluster ID: A6TeuCvYQcyPFIxM9OVJ0g
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.808 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Discovered group coordinator kafka-broker:9092 (id: 2147483646 rack: null)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.810 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Revoking previously assigned partitions []
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.815 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:53.820 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.828 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Successfully joined group with generation 11
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.831 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Setting newly assigned partitions: datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.839 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Found no committed offset for partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.854 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.855 INFO 1 --- [streaming-start] o.a.spark.streaming.util.RecurringTimer : Started timer for JobGenerator at time 1608051654000
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.855 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobGenerator : Started JobGenerator at 1608051654000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.856 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobScheduler : Started JobScheduler
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.867 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@781ecbac{/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.868 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@5cb654e3{/streaming/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.868 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6592a73b{/streaming/batch,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.869 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@14f11a56{/streaming/batch/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.869 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.870 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@543b855{/static/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.870 INFO 1 --- [ main] o.a.spark.streaming.StreamingContext : StreamingContext started
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.872 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.899 INFO 1 --- [ JobGenerator] o.a.s.s.d.InternalMapWithStateDStream : Time 1608051652000 ms is invalid as zeroTime is 1608051652000 ms , slideDuration is 2000 ms and difference is 0 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.931 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051654000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.932 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.934 INFO 1 --- [ JobScheduler] o.a.s.streaming.scheduler.JobScheduler : Starting job streaming job 1608051654000 ms.0 from job set of time 1608051654000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.934 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.950 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051656000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.951 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051654000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.951 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051654000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.954 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051654000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.960 INFO 1 --- [-job-executor-0] org.apache.spark.SparkContext : Starting job: isEmpty at RealtimeProcessor.java:302
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.961 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051654000 ms to writer queue
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.961 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051656000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.961 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051656000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.962 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051656000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.963 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051654000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051654000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.969 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051656000 ms to writer queue
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.975 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Got job 0 (isEmpty at RealtimeProcessor.java:302) with 1 output partitions
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.975 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Final stage: ResultStage 0 (isEmpty at RealtimeProcessor.java:302)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.976 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Parents of final stage: List()
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.977 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Missing parents: List()
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:56.997 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205), which has no missing parents
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.023 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1 stored as values in memory (estimated size 15.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.028 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1_piece0 stored as bytes in memory (estimated size 6.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.029 INFO 1 --- [r-event-loop-12] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on 034ba3310158:32843 (size: 6.7 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.029 INFO 1 --- [uler-event-loop] org.apache.spark.SparkContext : Created broadcast 1 from broadcast at DAGScheduler.scala:1163
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.042 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.043 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062622_321798
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.043 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205) (first 15 tasks are for partitions Vector(0))
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.043 INFO 1 --- [uler-event-loop] o.a.spark.scheduler.TaskSchedulerImpl : Adding task set 0.0 with 1 tasks
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.048 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.065 WARN 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-codec-1.13.jar at spark://5c1f7f9d3fca:44051/jars/commons-codec-1.13.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/zstd-jni-1.3.2-2.jar at spark://5c1f7f9d3fca:44051/jars/zstd-jni-1.3.2-2.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/janino-3.0.8.jar at spark://5c1f7f9d3fca:44051/jars/janino-3.0.8.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/commons-beanutils-1.7.0.jar at spark://5c1f7f9d3fca:44051/jars/commons-beanutils-1.7.0.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/parquet-common-1.10.1.jar at spark://5c1f7f9d3fca:44051/jars/parquet-common-1.10.1.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.289 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/mongodb-driver-core-3.11.2.jar at spark://5c1f7f9d3fca:44051/jars/mongodb-driver-core-3.11.2.jar with timestamp 1608051693289
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/reactive-streams-1.0.3.jar at spark://5c1f7f9d3fca:44051/jars/reactive-streams-1.0.3.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/xz-1.5.jar at spark://5c1f7f9d3fca:44051/jars/xz-1.5.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-jackson_2.11-3.5.3.jar at spark://5c1f7f9d3fca:44051/jars/json4s-jackson_2.11-3.5.3.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/metrics-core-4.1.3.jar at spark://5c1f7f9d3fca:44051/jars/metrics-core-4.1.3.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/osgi-resource-locator-1.0.3.jar at spark://5c1f7f9d3fca:44051/jars/osgi-resource-locator-1.0.3.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/hadoop-auth-2.6.5.jar at spark://5c1f7f9d3fca:44051/jars/hadoop-auth-2.6.5.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@4cb45048
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@1df8c7f4
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@11b234aa
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.058 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@20f94e9a
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@25865fca
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.059 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@7594d189
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@11e36e5c
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@236bb278
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@38a3c078
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@4cb45048
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.067 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@217abcc9
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@208ce928
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@11b234aa
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@20f94e9a
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@25865fca
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@7594d189
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Initialized and validated org.apache.spark.streaming.dstream.InternalMapWithStateDStream@11e36e5c
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.s.dstream.MapWithStateDStreamImpl : Initialized and validated org.apache.spark.streaming.dstream.MapWithStateDStreamImpl@236bb278
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.068 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@38a3c078
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Memory Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@4cb45048
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.FilteredDStream : Initialized and validated org.apache.spark.streaming.dstream.FilteredDStream@3299e315
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@2db07651
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@11b234aa
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.069 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@7ca0aa55
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.162 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values:
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | allow.auto.create.topics = true
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | auto.commit.interval.ms = 5000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | auto.offset.reset = latest
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | bootstrap.servers = [kafka-broker:9092]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | check.crcs = true
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | client.dns.lookup = default
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | client.id =
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | client.rack =
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | connections.max.idle.ms = 540000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | default.api.timeout.ms = 60000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | enable.auto.commit = false
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | exclude.internal.topics = true
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | fetch.max.bytes = 52428800
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | fetch.max.wait.ms = 500
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | fetch.min.bytes = 1
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | group.id = realtime-group
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | group.instance.id = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | heartbeat.interval.ms = 3000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | interceptor.classes = []
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | internal.leave.group.on.close = true
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | isolation.level = read_uncommitted
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | max.partition.fetch.bytes = 1048576
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | max.poll.interval.ms = 300000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | max.poll.records = 500
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | metadata.max.age.ms = 300000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | metric.reporters = []
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | metrics.num.samples = 2
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | metrics.recording.level = INFO
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | metrics.sample.window.ms = 30000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | receive.buffer.bytes = 65536
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | reconnect.backoff.max.ms = 1000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | reconnect.backoff.ms = 50
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | request.timeout.ms = 30000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | retry.backoff.ms = 100
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.client.callback.handler.class = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.jaas.config = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.kerberos.kinit.cmd = /usr/bin/kinit
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.kerberos.min.time.before.relogin = 60000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.kerberos.service.name = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.kerberos.ticket.renew.jitter = 0.05
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.kerberos.ticket.renew.window.factor = 0.8
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.login.callback.handler.class = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.login.class = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.login.refresh.buffer.seconds = 300
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.login.refresh.min.period.seconds = 60
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.login.refresh.window.factor = 0.8
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.login.refresh.window.jitter = 0.05
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | sasl.mechanism = GSSAPI
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | security.protocol = PLAINTEXT
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | send.buffer.bytes = 131072
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | session.timeout.ms = 10000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.cipher.suites = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.endpoint.identification.algorithm = https
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.key.password = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.keymanager.algorithm = SunX509
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.keystore.location = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.keystore.password = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.keystore.type = JKS
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.protocol = TLS
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.provider = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.secure.random.implementation = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.trustmanager.algorithm = PKIX
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.truststore.location = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.truststore.password = null
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | ssl.truststore.type = JKS
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | value.deserializer = class com.tecnositaf.dg.processor.model.serde.IncomingDataDeserializer
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.238 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.3.1
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.238 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 18a913733fb71c01
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.238 INFO 1 --- [ool-2-worker-29] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1608051823237
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.240 INFO 1 --- [ool-2-worker-29] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-1, groupId=realtime-group] Subscribed to topic(s): datagenerator-producer-topic
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.372 INFO 1 --- [ool-2-worker-29] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-1, groupId=realtime-group] Cluster ID: A6TeuCvYQcyPFIxM9OVJ0g
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.373 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Discovered group coordinator kafka-broker:9092 (id: 2147483646 rack: null)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.380 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Revoking previously assigned partitions []
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.380 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:43.400 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] (Re-)joining group
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.408 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Successfully joined group with generation 19
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.411 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Setting newly assigned partitions: datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.426 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-1, groupId=realtime-group] Found no committed offset for partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.447 INFO 1 --- [ool-2-worker-29] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.447 INFO 1 --- [streaming-start] o.a.spark.streaming.util.RecurringTimer : Started timer for JobGenerator at time 1608051824000
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.448 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobGenerator : Started JobGenerator at 1608051824000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.448 INFO 1 --- [streaming-start] o.a.s.streaming.scheduler.JobScheduler : Started JobScheduler
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.450 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@87276c4{/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.451 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@38f4a641{/streaming/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.452 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@63bb52ea{/streaming/batch,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.452 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@6f5df147{/streaming/batch/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.453 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@318155b1{/static/streaming,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.453 INFO 1 --- [ main] o.a.spark.streaming.StreamingContext : StreamingContext started
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.455 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.460 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.505 INFO 1 --- [ JobGenerator] o.a.s.s.d.InternalMapWithStateDStream : Time 1608051822000 ms is invalid as zeroTime is 1608051822000 ms , slideDuration is 2000 ms and difference is 0 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.553 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051824000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.554 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.556 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.569 INFO 1 --- [ JobScheduler] o.a.s.streaming.scheduler.JobScheduler : Starting job streaming job 1608051824000 ms.0 from job set of time 1608051824000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.569 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051826000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.582 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051824000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.588 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051824000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.598 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051824000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.621 INFO 1 --- [-job-executor-0] org.apache.spark.SparkContext : Starting job: isEmpty at RealtimeProcessor.java:302
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.632 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051824000 ms to writer queue
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.632 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051826000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.632 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051826000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.633 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051826000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.637 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051824000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051824000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.638 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Got job 0 (isEmpty at RealtimeProcessor.java:302) with 1 output partitions
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.639 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Final stage: ResultStage 0 (isEmpty at RealtimeProcessor.java:302)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.639 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Parents of final stage: List()
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.640 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Missing parents: List()
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.648 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051826000 ms to writer queue
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.680 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205), which has no missing parents
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.740 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1 stored as values in memory (estimated size 15.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.742 INFO 1 --- [uler-event-loop] o.a.spark.storage.memory.MemoryStore : Block broadcast_1_piece0 stored as bytes in memory (estimated size 6.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.743 INFO 1 --- [r-event-loop-14] o.apache.spark.storage.BlockManagerInfo : Added broadcast_1_piece0 in memory on c899ec0ab125:36271 (size: 6.7 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.744 INFO 1 --- [uler-event-loop] org.apache.spark.SparkContext : Created broadcast 1 from broadcast at DAGScheduler.scala:1163
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.757 INFO 1 --- [uler-event-loop] org.apache.spark.scheduler.DAGScheduler : Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at mapPartitionsWithIndex at RealtimeProcessor.java:205) (first 15 tasks are for partitions Vector(0))
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.758 INFO 1 --- [uler-event-loop] o.a.spark.scheduler.TaskSchedulerImpl : Adding task set 0.0 with 1 tasks
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.824 INFO 1 --- [r-event-loop-15] o.apache.spark.scheduler.TaskSetManager : Starting task 0.0 in stage 0.0 (TID 0, 10.0.6.251, executor 0, partition 0, PROCESS_LOCAL, 7771 bytes)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.851 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.851 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062908_322084
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.870 INFO 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.916 WARN 1 --- [ Thread-17] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.916 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051824000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.916 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051824000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051824000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.980 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.980 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062909_322085
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:46.996 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.003 WARN 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.003 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 2 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051824000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
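[Editor's note] The cycle above repeats for every checkpoint retry: the NameNode at namenode:8020 accepts the file (`addBlock` succeeds), but the client cannot open a block pipeline to the DataNode it is told to use at 10.0.6.241:50010 (`java.net.ConnectException: Connection refused`), so the only DataNode gets excluded and the write fails with "could only be replicated to 0 nodes instead of minReplication (=1)". When the HDFS client runs in a container that cannot reach the DataNode's advertised IP (common on Docker Swarm overlay networks), one commonly suggested mitigation — an assumption here, not confirmed by these logs — is to have clients and DataNodes address DataNodes by hostname so Docker's DNS can resolve them:

```xml
<!-- hdfs-site.xml — hypothetical settings for this stack; verify against your Hadoop version -->
<configuration>
  <property>
    <!-- Clients connect to DataNodes by hostname instead of the advertised IP -->
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
  </property>
  <property>
    <!-- DataNodes also use hostnames when connecting to each other -->
    <name>dfs.datanode.use.datanode.hostname</name>
    <value>true</value>
  </property>
</configuration>
```

Before changing config, it may help to confirm the symptom from inside the processor container, e.g. `nc -zv 10.0.6.241 50010` versus `nc -zv namenode 8020` — in this log only the DataNode port is refused.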
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.003 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051824000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051824000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.009 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.009 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062910_322086
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.009 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.010 WARN 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.011 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 3 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051824000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.668 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051782000'
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.668 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051782000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051782000'
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.670 INFO 1 --- [r-event-loop-13] o.apache.spark.scheduler.TaskSetManager : Starting task 0.0 in stage 0.0 (TID 0, 10.0.6.251, executor 0, partition 0, PROCESS_LOCAL, 7771 bytes)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.673 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.673 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062841_322017
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.674 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.676 WARN 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.676 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 2 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051782000'
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.677 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051782000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051782000'
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.682 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.682 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062842_322018
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.682 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.684 WARN 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | 2020-12-15 17:03:03.684 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 3 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051782000'
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 |
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.rwc2rmqfn10r@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.140 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051738000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051738000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.143 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.143 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062767_321943
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.144 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.145 WARN 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.146 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 2 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051738000'
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 |
dg-stack_boot-spark-processor.1.ul7zakgvnuky@ns3170655 | 2020-12-15 17:02:18.146 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051738000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051738000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.070 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051654000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.070 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051654000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051654000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.071 INFO 1 --- [r-event-loop-13] o.apache.spark.scheduler.TaskSetManager : Starting task 0.0 in stage 0.0 (TID 0, 10.0.6.251, executor 0, partition 0, PROCESS_LOCAL, 7771 bytes)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.083 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.083 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062623_321799
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.086 INFO 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.088 WARN 1 --- [ Thread-19] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.088 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 2 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051654000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.089 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051654000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051654000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.103 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.103 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062624_321800
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.105 INFO 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.108 WARN 1 --- [ Thread-20] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.108 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 3 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051654000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.109 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Could not write checkpoint for time 1608051654000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051654000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.109 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051656000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051656000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.115 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.115 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062625_321801
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.118 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.123 WARN 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.124 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051656000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.126 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051656000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051656000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.131 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.132 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062626_321802
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.133 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.136 WARN 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.136 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 2 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051656000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.136 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051656000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051656000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.146 INFO 1 --- [ Thread-23] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.146 INFO 1 --- [ Thread-23] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062627_321803
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.147 INFO 1 --- [ Thread-23] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.148 WARN 1 --- [ Thread-23] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.149 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 3 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051656000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:57.149 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Could not write checkpoint for time 1608051656000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051656000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.001 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Seeking to LATEST offset of partition datagenerator-producer-topic-0
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.003 INFO 1 --- [ JobGenerator] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-1, groupId=realtime-group] Resetting offset for partition datagenerator-producer-topic-0 to offset 0.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.013 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobScheduler : Added jobs for time 1608051658000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.013 INFO 1 --- [ JobGenerator] o.a.s.streaming.scheduler.JobGenerator : Checkpointing graph for time 1608051658000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.013 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updating checkpoint data for time 1608051658000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.013 INFO 1 --- [ JobGenerator] org.apache.spark.streaming.DStreamGraph : Updated checkpoint data for time 1608051658000 ms
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.015 INFO 1 --- [ JobGenerator] o.a.spark.streaming.CheckpointWriter : Submitted checkpoint of time 1608051658000 ms to writer queue
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.015 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051658000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051658000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.021 INFO 1 --- [ Thread-24] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.021 INFO 1 --- [ Thread-24] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062628_321804
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.022 INFO 1 --- [ Thread-24] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.024 WARN 1 --- [ Thread-24] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.025 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051658000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.025 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051658000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051658000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.029 INFO 1 --- [ Thread-25] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.029 INFO 1 --- [ Thread-25] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062629_321805
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.030 INFO 1 --- [ Thread-25] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.032 WARN 1 --- [ Thread-25] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | 2020-12-15 17:00:58.032 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 2 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051658000'
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 |
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.tqjmcv01jx1l@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/json4s-core_2.11-3.5.3.jar at spark://5c1f7f9d3fca:44051/jars/json4s-core_2.11-3.5.3.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/spring-tx-5.2.4.RELEASE.jar at spark://5c1f7f9d3fca:44051/jars/spring-tx-5.2.4.RELEASE.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/orc-core-1.5.5-nohive.jar at spark://5c1f7f9d3fca:44051/jars/orc-core-1.5.5-nohive.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/error_prone_annotations-2.3.4.jar at spark://5c1f7f9d3fca:44051/jars/error_prone_annotations-2.3.4.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.290 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/libs/paranamer-2.8.jar at spark://5c1f7f9d3fca:44051/jars/paranamer-2.8.jar with timestamp 1608051693290
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.291 INFO 1 --- [ main] org.apache.spark.SparkContext : Added JAR /app/classpath/dangerous-good-processor.original.jar at spark://5c1f7f9d3fca:44051/jars/dangerous-good-processor.original.jar with timestamp 1608051693291
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:33.628 INFO 1 --- [er-threadpool-0] s.d.c.StandaloneAppClient$ClientEndpoint : Connecting to master spark://spark-master:7077...
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.004 INFO 1 --- [pc-connection-0] o.a.s.n.client.TransportClientFactory : Successfully created connection to spark-master/10.0.6.246:7077 after 218 ms (0 ms spent in bootstraps)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.184 INFO 1 --- [er-event-loop-3] o.a.s.s.c.StandaloneSchedulerBackend : Connected to Spark cluster with app ID app-20201215170134-0006
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.197 INFO 1 --- [er-event-loop-3] s.d.c.StandaloneAppClient$ClientEndpoint : Executor added: app-20201215170134-0006/0 on worker-20201215165722-10.0.6.251-38117 (10.0.6.251:38117) with 2 core(s)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.197 INFO 1 --- [er-event-loop-3] o.a.s.s.c.StandaloneSchedulerBackend : Granted executor ID app-20201215170134-0006/0 on hostPort 10.0.6.251:38117 with 2 core(s), 1024.0 MB RAM
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.199 INFO 1 --- [er-event-loop-3] s.d.c.StandaloneAppClient$ClientEndpoint : Executor updated: app-20201215170134-0006/0 is now RUNNING
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.220 INFO 1 --- [ main] org.apache.spark.util.Utils : Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36119.
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.221 INFO 1 --- [ main] o.a.s.n.netty.NettyBlockTransferService : Server created on 5c1f7f9d3fca:36119
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.221 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.245 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registering BlockManager BlockManagerId(driver, 5c1f7f9d3fca, 36119, None)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.261 INFO 1 --- [er-event-loop-0] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 5c1f7f9d3fca:36119 with 9.2 GB RAM, BlockManagerId(driver, 5c1f7f9d3fca, 36119, None)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.268 INFO 1 --- [ main] o.a.spark.storage.BlockManagerMaster : Registered BlockManager BlockManagerId(driver, 5c1f7f9d3fca, 36119, None)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.269 INFO 1 --- [ main] org.apache.spark.storage.BlockManager : Initialized BlockManager: BlockManagerId(driver, 5c1f7f9d3fca, 36119, None)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.411 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@603d25db{/metrics/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:34.468 INFO 1 --- [ main] o.a.s.s.c.StandaloneSchedulerBackend : SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.235 WARN 1 --- [ main] org.apache.spark.SparkContext : Using an existing SparkContext; some configuration may not take effect.
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.625 INFO 1 --- [ main] org.mongodb.driver.cluster : Cluster created with settings {hosts=[mongo-server:27017], mode=MULTIPLE, requiredClusterType=REPLICA_SET, serverSelectionTimeout='30000 ms', maxWaitQueueSize=500, requiredReplicaSetName='rs0'}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.625 INFO 1 --- [ main] org.mongodb.driver.cluster : Adding discovered server mongo-server:27017 to client view of cluster
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.671 INFO 1 --- [go-server:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:1, serverValue:24}] to mongo-server:27017
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.690 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-server:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=3326337, setName='rs0', canonicalAddress=mongo-server:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=7fffffff000000000000001d, setVersion=37028, lastWriteDate=Tue Dec 15 17:01:35 UTC 2020, lastUpdateTimeNanos=1845412210771588}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.692 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-0:27017 to client view of cluster
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.698 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Adding discovered server mongo-replica-1:27017 to client view of cluster
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.700 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max election id to 7fffffff000000000000001d from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.701 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Setting max set version to 37028 from replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.702 INFO 1 --- [go-server:27017] org.mongodb.driver.cluster : Discovered replica set primary mongo-server:27017
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.723 INFO 1 --- [replica-1:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:3, serverValue:18}] to mongo-replica-1:27017
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.745 INFO 1 --- [replica-0:27017] org.mongodb.driver.connection : Opened connection [connectionId{localValue:2, serverValue:19}] to mongo-replica-0:27017
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.748 INFO 1 --- [replica-0:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-0:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=2433255, setName='rs0', canonicalAddress=mongo-replica-0:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:01:35 UTC 2020, lastUpdateTimeNanos=1845412282433927}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:35.749 INFO 1 --- [replica-1:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=mongo-replica-1:27017, type=REPLICA_SET_SECONDARY, state=CONNECTED, ok=true, version=ServerVersion{versionList=[4, 4, 2]}, minWireVersion=0, maxWireVersion=9, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=25465205, setName='rs0', canonicalAddress=mongo-replica-1:27017, hosts=[mongo-server:27017], passives=[mongo-replica-0:27017, mongo-replica-1:27017], arbiters=[], primary='mongo-server:27017', tagSet=TagSet{[]}, electionId=null, setVersion=37028, lastWriteDate=Tue Dec 15 17:01:35 UTC 2020, lastUpdateTimeNanos=1845412283706636}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:36.427 INFO 1 --- [ main] org.mongodb.driver.connection : Opened connection [connectionId{localValue:4, serverValue:25}] to mongo-server:27017
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | ===========================================================
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | PortalMetadata(id=5f51f7ecdf7c147d8529cc8d, portalName=P7, expiration=60, asset=com.tecnositaf.dg.processor.model.assets.Asset@1a712f12)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.112 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0 stored as values in memory (estimated size 22.7 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.170 INFO 1 --- [ main] o.a.spark.storage.memory.MemoryStore : Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.2 KB, free 9.2 GB)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.173 INFO 1 --- [er-event-loop-0] o.apache.spark.storage.BlockManagerInfo : Added broadcast_0_piece0 in memory on 5c1f7f9d3fca:36119 (size: 2.2 KB, free: 9.2 GB)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.178 INFO 1 --- [ main] org.apache.spark.SparkContext : Created broadcast 0 from broadcast at RealtimeProcessor.java:81
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.193 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/spark-warehouse').
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.194 INFO 1 --- [ main] o.apache.spark.sql.internal.SharedState : Warehouse path is 'file:/spark-warehouse'.
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.204 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2604940{/SQL,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.205 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@2e564c78{/SQL/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.205 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@da5b46f{/SQL/execution,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.206 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@99f75e4{/SQL/execution/json,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.207 INFO 1 --- [ main] o.s.jetty.server.handler.ContextHandler : Started o.s.j.s.ServletContextHandler@7827cdfc{/static/sql,null,AVAILABLE,@Spark}
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.332 INFO 1 --- [r-event-loop-13] seGrainedSchedulerBackend$DriverEndpoint : Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.0.6.251:45762) with ID 0
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.533 INFO 1 --- [er-event-loop-4] o.a.s.s.BlockManagerMasterEndpoint : Registering block manager 10.0.6.251:38325 with 366.3 MB RAM, BlockManagerId(0, 10.0.6.251, 38325, None)
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.644 INFO 1 --- [ main] o.a.s.s.e.s.s.StateStoreCoordinatorRef : Registered StateStoreCoordinator endpoint
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.861 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding enable.auto.commit to false for executor
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.861 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding auto.offset.reset to none for executor
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.861 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding executor group.id to spark-executor-realtime-group
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:37.862 WARN 1 --- [ main] o.a.spark.streaming.kafka010.KafkaUtils : overriding receive.buffer.bytes to 65536 see KAFKA-3135
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | starting dangerous good processor
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.130 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.ShuffledDStream@a23b96b
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.130 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.MappedDStream@b3c7c75
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.130 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.dstream.TransformedDStream@2297c8bf
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.130 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Duration for remembering RDDs set to 40000 ms for org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@3dbcde0b
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.136 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.136 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@3dbcde0b
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@2297c8bf
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.137 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Remember interval = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ForEachDStream : Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@6c3830ed
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.k.DirectKafkaInputDStream : Initialized and validated org.apache.spark.streaming.kafka010.DirectKafkaInputDStream@3dbcde0b
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.dstream.TransformedDStream : Initialized and validated org.apache.spark.streaming.dstream.TransformedDStream@2297c8bf
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.MappedDStream : Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@b3c7c75
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Storage level = Serialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Checkpoint interval = null
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Remember interval = 40000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.streaming.dstream.ShuffledDStream : Initialized and validated org.apache.spark.streaming.dstream.ShuffledDStream@a23b96b
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Slide time = 2000 ms
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Storage level = Memory Deserialized 1x Replicated
dg-stack_boot-spark-processor.1.t7773pk99g34@ns3170655 | 2020-12-15 17:01:38.138 INFO 1 --- [streaming-start] o.a.s.s.d.InternalMapWithStateDStream : Checkpoint interval = 20000 ms
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.011 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Could not write checkpoint for time 1608051824000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051824000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.012 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051826000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051826000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.015 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.015 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062911_322087
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.020 INFO 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.023 WARN 1 --- [ Thread-21] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.028 WARN 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Error in attempt 1 of writing checkpoint to 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051826000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy67.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1528) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1345) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.028 INFO 1 --- [ool-22-thread-1] o.a.spark.streaming.CheckpointWriter : Saving checkpoint for time 1608051826000 ms to file 'hdfs://namenode:8020/dangerousgoods/checkpoint/checkpoint-1608051826000'
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.065 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Exception in createBlockOutputStream
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | java.net.ConnectException: Connection refused
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.base/sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1606) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1404) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.065 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Abandoning BP-179465841-10.0.6.93-1607104606284:blk_1074062912_322088
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.069 INFO 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : Excluding datanode 10.0.6.241:50010
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | 2020-12-15 17:03:47.078 WARN 1 --- [ Thread-22] org.apache.hadoop.hdfs.DFSClient : DataStreamer Exception
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | org.apache.hadoop.ipc.RemoteException: File /dangerousgoods/checkpoint/temp could only be replicated to 0 nodes instead of minReplication (=1). There are 1 datanode(s) running and 1 node(s) are excluded in this operation.
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1733)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSDirWriteFileOp.chooseTargetForNewBlock(FSDirWriteFileOp.java:265)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2496)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:828)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:447)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:845)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:788)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at java.security.AccessController.doPrivileged(Native Method)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at javax.security.auth.Subject.doAs(Subject.java:422)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2455)
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 |
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1470) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.Client.call(Client.java:1401) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232) ~[hadoop-common-2.6.5.jar:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at com.sun.proxy.$Proxy66.addBlock(Unknown Source) ~[na:na]
dg-stack_boot-spark-processor.1.wp7ooo9tz7d8@ns3170655 | at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:399) ~[hadoop-hdfs-2.6.5.jar:na]
dg-stack_boot-spark-pro
$ docker service logs dg-stack_datanode
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | Configuring core
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | - Setting fs.defaultFS=hdfs://namenode:8020
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | Configuring hdfs
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | - Setting dfs.namenode.datanode.registration.ip-hostname-check=false
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | - Setting dfs.datanode.data.dir=file:///hadoop/dfs/data
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | - Setting dfs.replication= 1
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | Configuring yarn
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | Configuring httpfs
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | Configuring kms
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | Configuring for multihomed network
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | 20/12/15 16:57:30 INFO datanode.DataNode: STARTUP_MSG:
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | /************************************************************
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | STARTUP_MSG: Starting DataNode
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | STARTUP_MSG: user = root
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | STARTUP_MSG: host = datanode/10.0.6.7
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | STARTUP_MSG: args = []
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | STARTUP_MSG: version = 2.8.0
dg-stack_datanode.1.t0vfyxzofqyq@ns3170655 | STARTUP_MSG: classpath = /etc/hadoop:/opt/hadoop-2.8.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/hadoop-auth-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/nimbus-jose-jwt-3.9.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-logging-1.1.3.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/avro-1.7.4.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/servlet-api-2.5.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/hamcrest-core-1.3.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jcip-annotations-1.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/xmlenc-0.52.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jetty-6.1.26.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/htrace-core4-4.0.1-incubating.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/zookeeper-3.4.6.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-io-2.4.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/gson-2.2.4.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jersey-core-1.9.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/curator-framework-2.7.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jettison-1.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/paranamer-2.3.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/httpcore-4.4.4.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/stax-api-1.0-2.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/log4j-1.2.17.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/httpclient-4.5.2.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/guava-11.0.2.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-cli-1.2.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-math3-3.1.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-digester-1.8.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/hadoop-annotations-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/junit-4.11.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-net-3.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/curator-client-2.7.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/activation-1.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jsp-api-2.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jsch-0.1.51.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-lang-2.6.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jersey-server-1.9.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/xz-1.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/json-smart-1.1.1.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jets3t-0.9.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-codec-1.4.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jersey-json-1.9.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jetty-sslengine-6.1.26.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-collections-3.2.2.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/jsr305-3.0.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/lib/asm-3.2.jar:/opt/hadoop-2.8.0/share/hadoop/common/hadoop-common-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/hadoop-nfs-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/common/hadoop-common-2.8.0-tests.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/htrace-core4-4.0.1-incubating.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/commons-io-2.4.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/guava-11.0.2.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/okio-1.4.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/opt/hadoop-2.8.0/s
hare/hadoop/hdfs/lib/commons-codec-1.4.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/asm-3.2.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/lib/okhttp-2.4.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/hadoop-hdfs-client-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/hadoop-hdfs-client-2.8.0-tests.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/hadoop-hdfs-2.8.0-tests.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/hadoop-hdfs-nfs-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/hadoop-hdfs-2.8.0.jar:/opt/hadoop-2.8.0/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.0-tests.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/objenesis-2.1.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/servlet-api-2.5.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jetty-6.1.26.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/javassist-3.18.1-GA.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/commons-io-2.4.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jersey-core-1.9.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/jettison-1.1.jar:/opt/hadoop-2.8.0/share/hadoop/yarn/lib/aopalliance-1.0.jar