@worace · Last active October 16, 2017
Accumulators Log Output
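The log below comes from a small Clojure/Spark job that exercises named LongAccumulators from both an action (foreach, job 0) and a transformation (map realized by saveAsTextFile, job 1). The gist's actual source isn't shown here, so the following is only a minimal reconstruction sketch, assuming plain Java interop and clojure.tools.logging; the accumulator names, RDD contents, and output path are taken from the log itself, everything else is an assumption:

```clojure
(ns accumulators.core
  (:require [clojure.tools.logging :as log])
  (:import [org.apache.spark SparkConf]
           [org.apache.spark.api.java JavaSparkContext]
           [org.apache.spark.api.java.function Function VoidFunction]))

(defn -main [& _]
  (let [conf (-> (SparkConf.) (.setMaster "local") (.setAppName "accumulators"))
        jsc  (JavaSparkContext. conf)
        ;; Two named accumulators, matching the ids/names seen in the log.
        counter        (.longAccumulator (.sc jsc) "num-lines")        ; id 0
        action-counter (.longAccumulator (.sc jsc) "num-lines-action") ; id 1
        rdd  (.parallelize jsc (mapv str "ABCDEFGHIJKLMNOPQRSTUVWXYZ"))]
    (log/warn (str "*** Input RDD: " (pr-str rdd)))
    ;; Job 0: an action (foreach) that bumps num-lines-action per element.
    ;; The reified function serializes because VoidFunction extends
    ;; java.io.Serializable; in local mode the task runs in-process.
    (.foreach rdd
              (reify VoidFunction
                (call [_ c]
                  (log/warn (str "*** ForEach Processing Char: " c " ***"))
                  (log/warn (str "Counter object: " (pr-str action-counter)))
                  (log/warn (str "Counter value: " (.value action-counter)))
                  (.add action-counter 1))))
    ;; Job 1: a transformation (map) that bumps num-lines, only realized
    ;; when saveAsTextFile forces the computation.
    (-> (.map rdd
              (reify Function
                (call [_ c]
                  (log/warn (str "*** Map Processing Char: " c " ***"))
                  (.add counter 1)
                  c)))
        (.saveAsTextFile "/tmp/accumulators_1508189311/output"))
    (log/warn (str "*** Counter value after saving rdd: " (.value counter)))))
```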
Compiling accumulators.core
log4j:WARN No appenders could be found for logger (accumulators.core).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/16 14:28:30 INFO SparkContext: Running Spark version 2.1.0
17/10/16 14:28:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/16 14:28:30 INFO SecurityManager: Changing view acls to: horace
17/10/16 14:28:30 INFO SecurityManager: Changing modify acls to: horace
17/10/16 14:28:30 INFO SecurityManager: Changing view acls groups to:
17/10/16 14:28:30 INFO SecurityManager: Changing modify acls groups to:
17/10/16 14:28:30 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(horace); groups with view permissions: Set(); users with modify permissions: Set(horace); groups with modify permissions: Set()
17/10/16 14:28:31 INFO Utils: Successfully started service 'sparkDriver' on port 62197.
17/10/16 14:28:31 INFO SparkEnv: Registering MapOutputTracker
17/10/16 14:28:31 INFO SparkEnv: Registering BlockManagerMaster
17/10/16 14:28:31 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/10/16 14:28:31 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/10/16 14:28:31 INFO DiskBlockManager: Created local directory at /private/var/folders/lk/5zj87sj92cqcb2kfz8b_gywm0000gn/T/blockmgr-e4049546-c660-4b0f-8037-24a52567865b
17/10/16 14:28:31 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
17/10/16 14:28:31 INFO SparkEnv: Registering OutputCommitCoordinator
17/10/16 14:28:31 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/10/16 14:28:31 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.201.111:4040
17/10/16 14:28:31 INFO Executor: Starting executor ID driver on host localhost
17/10/16 14:28:31 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 62198.
17/10/16 14:28:31 INFO NettyBlockTransferService: Server created on 10.0.201.111:62198
17/10/16 14:28:31 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/10/16 14:28:31 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.0.201.111, 62198, None)
17/10/16 14:28:31 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.201.111:62198 with 2004.6 MB RAM, BlockManagerId(driver, 10.0.201.111, 62198, None)
17/10/16 14:28:31 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.0.201.111, 62198, None)
17/10/16 14:28:31 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.0.201.111, 62198, None)
17/10/16 14:28:32 WARN core: *** Input RDD: #object[org.apache.spark.api.java.JavaRDD 0x6f044c58 ParallelCollectionRDD[0] at parallelize at NativeMethodAccessorImpl.java:0]
17/10/16 14:28:32 INFO SparkContext: Starting job: foreach at NativeMethodAccessorImpl.java:0
17/10/16 14:28:32 INFO DAGScheduler: Got job 0 (foreach at NativeMethodAccessorImpl.java:0) with 1 output partitions
17/10/16 14:28:32 INFO DAGScheduler: Final stage: ResultStage 0 (foreach at NativeMethodAccessorImpl.java:0)
17/10/16 14:28:32 INFO DAGScheduler: Parents of final stage: List()
17/10/16 14:28:32 INFO DAGScheduler: Missing parents: List()
17/10/16 14:28:32 INFO DAGScheduler: Submitting ResultStage 0 (ParallelCollectionRDD[0] at parallelize at NativeMethodAccessorImpl.java:0), which has no missing parents
17/10/16 14:28:32 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.0 KB, free 2004.6 MB)
17/10/16 14:28:32 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1670.0 B, free 2004.6 MB)
17/10/16 14:28:32 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.0.201.111:62198 (size: 1670.0 B, free: 2004.6 MB)
17/10/16 14:28:32 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:996
17/10/16 14:28:32 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (ParallelCollectionRDD[0] at parallelize at NativeMethodAccessorImpl.java:0)
17/10/16 14:28:32 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks
17/10/16 14:28:32 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 5993 bytes)
17/10/16 14:28:32 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: A ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 0)]
17/10/16 14:28:32 WARN core: Counter value: 0
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: B ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 1)]
17/10/16 14:28:32 WARN core: Counter value: 1
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: C ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 2)]
17/10/16 14:28:32 WARN core: Counter value: 2
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: D ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 3)]
17/10/16 14:28:32 WARN core: Counter value: 3
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: E ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 4)]
17/10/16 14:28:32 WARN core: Counter value: 4
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: F ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 5)]
17/10/16 14:28:32 WARN core: Counter value: 5
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: G ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 6)]
17/10/16 14:28:32 WARN core: Counter value: 6
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: H ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 7)]
17/10/16 14:28:32 WARN core: Counter value: 7
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: I ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 8)]
17/10/16 14:28:32 WARN core: Counter value: 8
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: J ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 9)]
17/10/16 14:28:32 WARN core: Counter value: 9
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: K ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 10)]
17/10/16 14:28:32 WARN core: Counter value: 10
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: L ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 11)]
17/10/16 14:28:32 WARN core: Counter value: 11
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: M ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 12)]
17/10/16 14:28:32 WARN core: Counter value: 12
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: N ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 13)]
17/10/16 14:28:32 WARN core: Counter value: 13
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: O ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 14)]
17/10/16 14:28:32 WARN core: Counter value: 14
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: P ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 15)]
17/10/16 14:28:32 WARN core: Counter value: 15
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: Q ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 16)]
17/10/16 14:28:32 WARN core: Counter value: 16
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: R ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 17)]
17/10/16 14:28:32 WARN core: Counter value: 17
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: S ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 18)]
17/10/16 14:28:32 WARN core: Counter value: 18
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: T ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 19)]
17/10/16 14:28:32 WARN core: Counter value: 19
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: U ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 20)]
17/10/16 14:28:32 WARN core: Counter value: 20
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: V ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 21)]
17/10/16 14:28:32 WARN core: Counter value: 21
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: W ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 22)]
17/10/16 14:28:32 WARN core: Counter value: 22
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: X ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 23)]
17/10/16 14:28:32 WARN core: Counter value: 23
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: Y ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 24)]
17/10/16 14:28:32 WARN core: Counter value: 24
17/10/16 14:28:32 WARN core: *** ForEach Processing Char: Z ***
17/10/16 14:28:32 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x526eadf1 LongAccumulator(id: 1, name: Some(num-lines-action), value: 25)]
17/10/16 14:28:32 WARN core: Counter value: 25
17/10/16 14:28:32 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 883 bytes result sent to driver
17/10/16 14:28:32 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 189 ms on localhost (executor driver) (1/1)
17/10/16 14:28:32 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/10/16 14:28:32 INFO DAGScheduler: ResultStage 0 (foreach at NativeMethodAccessorImpl.java:0) finished in 0.205 s
17/10/16 14:28:32 INFO DAGScheduler: Job 0 finished: foreach at NativeMethodAccessorImpl.java:0, took 0.479866 s
17/10/16 14:28:32 WARN core: Increment Counter Executor
17/10/16 14:28:32 WARN core: *** Counter Executor Value: 1
17/10/16 14:28:32 WARN core: *** Will Save RDD to file ***
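Between the two jobs the driver increments and reads the action counter itself. Per Spark's documented accumulator contract, task-side updates to a registered accumulator are merged back into the driver's original instance once the action completes, so a read through the original reference would be expected to see all 26 foreach increments. A minimal sketch of that read pattern (hypothetical; `action-counter` is the name assumed in the sketch above):

```clojure
;; After the foreach action finishes, read through the same reference
;; returned by sc.longAccumulator: task updates merge into that instance.
(.add action-counter 1)  ; driver-side increment
(log/warn (str "*** Counter Executor Value: " (.value action-counter)))
```

The value of 1 logged just above suggests the read here may instead have gone through a detached copy that saw only the local driver-side increment; the differing identity hashes later in the log point the same way.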
17/10/16 14:28:33 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
17/10/16 14:28:33 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
17/10/16 14:28:33 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
17/10/16 14:28:33 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
17/10/16 14:28:33 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
17/10/16 14:28:33 INFO SparkContext: Starting job: saveAsTextFile at NativeMethodAccessorImpl.java:0
17/10/16 14:28:33 INFO DAGScheduler: Got job 1 (saveAsTextFile at NativeMethodAccessorImpl.java:0) with 1 output partitions
17/10/16 14:28:33 INFO DAGScheduler: Final stage: ResultStage 1 (saveAsTextFile at NativeMethodAccessorImpl.java:0)
17/10/16 14:28:33 INFO DAGScheduler: Parents of final stage: List()
17/10/16 14:28:33 INFO DAGScheduler: Missing parents: List()
17/10/16 14:28:33 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[2] at saveAsTextFile at NativeMethodAccessorImpl.java:0), which has no missing parents
17/10/16 14:28:33 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 49.6 KB, free 2004.5 MB)
17/10/16 14:28:33 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 17.8 KB, free 2004.5 MB)
17/10/16 14:28:33 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.0.201.111:62198 (size: 17.8 KB, free: 2004.6 MB)
17/10/16 14:28:33 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:996
17/10/16 14:28:33 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 1 (MapPartitionsRDD[2] at saveAsTextFile at NativeMethodAccessorImpl.java:0)
17/10/16 14:28:33 INFO TaskSchedulerImpl: Adding task set 1.0 with 1 tasks
17/10/16 14:28:33 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 1, localhost, executor driver, partition 0, PROCESS_LOCAL, 6000 bytes)
17/10/16 14:28:33 INFO Executor: Running task 0.0 in stage 1.0 (TID 1)
17/10/16 14:28:33 INFO deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
17/10/16 14:28:33 INFO deprecation: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
17/10/16 14:28:33 INFO deprecation: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
17/10/16 14:28:33 INFO deprecation: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
17/10/16 14:28:33 WARN core: *** Map Processing Char: A ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 0)]
17/10/16 14:28:33 WARN core: Counter value: 0
17/10/16 14:28:33 WARN core: *** Map Processing Char: B ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 1)]
17/10/16 14:28:33 WARN core: Counter value: 1
17/10/16 14:28:33 WARN core: *** Map Processing Char: C ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 2)]
17/10/16 14:28:33 WARN core: Counter value: 2
17/10/16 14:28:33 WARN core: *** Map Processing Char: D ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 3)]
17/10/16 14:28:33 WARN core: Counter value: 3
17/10/16 14:28:33 WARN core: *** Map Processing Char: E ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 4)]
17/10/16 14:28:33 WARN core: Counter value: 4
17/10/16 14:28:33 WARN core: *** Map Processing Char: F ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 5)]
17/10/16 14:28:33 WARN core: Counter value: 5
17/10/16 14:28:33 WARN core: *** Map Processing Char: G ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 6)]
17/10/16 14:28:33 WARN core: Counter value: 6
17/10/16 14:28:33 WARN core: *** Map Processing Char: H ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 7)]
17/10/16 14:28:33 WARN core: Counter value: 7
17/10/16 14:28:33 WARN core: *** Map Processing Char: I ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 8)]
17/10/16 14:28:33 WARN core: Counter value: 8
17/10/16 14:28:33 WARN core: *** Map Processing Char: J ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 9)]
17/10/16 14:28:33 WARN core: Counter value: 9
17/10/16 14:28:33 WARN core: *** Map Processing Char: K ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 10)]
17/10/16 14:28:33 WARN core: Counter value: 10
17/10/16 14:28:33 WARN core: *** Map Processing Char: L ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 11)]
17/10/16 14:28:33 WARN core: Counter value: 11
17/10/16 14:28:33 WARN core: *** Map Processing Char: M ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 12)]
17/10/16 14:28:33 WARN core: Counter value: 12
17/10/16 14:28:33 WARN core: *** Map Processing Char: N ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 13)]
17/10/16 14:28:33 WARN core: Counter value: 13
17/10/16 14:28:33 WARN core: *** Map Processing Char: O ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 14)]
17/10/16 14:28:33 WARN core: Counter value: 14
17/10/16 14:28:33 WARN core: *** Map Processing Char: P ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 15)]
17/10/16 14:28:33 WARN core: Counter value: 15
17/10/16 14:28:33 WARN core: *** Map Processing Char: Q ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 16)]
17/10/16 14:28:33 WARN core: Counter value: 16
17/10/16 14:28:33 WARN core: *** Map Processing Char: R ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 17)]
17/10/16 14:28:33 WARN core: Counter value: 17
17/10/16 14:28:33 WARN core: *** Map Processing Char: S ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 18)]
17/10/16 14:28:33 WARN core: Counter value: 18
17/10/16 14:28:33 WARN core: *** Map Processing Char: T ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 19)]
17/10/16 14:28:33 WARN core: Counter value: 19
17/10/16 14:28:33 WARN core: *** Map Processing Char: U ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 20)]
17/10/16 14:28:33 WARN core: Counter value: 20
17/10/16 14:28:33 WARN core: *** Map Processing Char: V ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 21)]
17/10/16 14:28:33 WARN core: Counter value: 21
17/10/16 14:28:33 WARN core: *** Map Processing Char: W ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 22)]
17/10/16 14:28:33 WARN core: Counter value: 22
17/10/16 14:28:33 WARN core: *** Map Processing Char: X ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 23)]
17/10/16 14:28:33 WARN core: Counter value: 23
17/10/16 14:28:33 WARN core: *** Map Processing Char: Y ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 24)]
17/10/16 14:28:33 WARN core: Counter value: 24
17/10/16 14:28:33 WARN core: *** Map Processing Char: Z ***
17/10/16 14:28:33 WARN core: Counter object: #object[org.apache.spark.util.LongAccumulator 0x4bec0eb8 LongAccumulator(id: 0, name: Some(num-lines), value: 25)]
17/10/16 14:28:33 WARN core: Counter value: 25
17/10/16 14:28:33 INFO FileOutputCommitter: Saved output of task 'attempt_20171016142833_0001_m_000000_1' to file:/tmp/accumulators_1508189311/output/_temporary/0/task_20171016142833_0001_m_000000
17/10/16 14:28:33 INFO SparkHadoopMapRedUtil: attempt_20171016142833_0001_m_000000_1: Committed
17/10/16 14:28:33 INFO Executor: Finished task 0.0 in stage 1.0 (TID 1). 883 bytes result sent to driver
17/10/16 14:28:33 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 1) in 126 ms on localhost (executor driver) (1/1)
17/10/16 14:28:33 INFO TaskSchedulerImpl: Removed TaskSet 1.0, whose tasks have all completed, from pool
17/10/16 14:28:33 INFO DAGScheduler: ResultStage 1 (saveAsTextFile at NativeMethodAccessorImpl.java:0) finished in 0.128 s
17/10/16 14:28:33 INFO DAGScheduler: Job 1 finished: saveAsTextFile at NativeMethodAccessorImpl.java:0, took 0.149717 s
17/10/16 14:28:33 WARN core: *** Counter object after saving rdd: #object[org.apache.spark.util.LongAccumulator 0x74606204 LongAccumulator(id: 0, name: Some(num-lines), value: 0)]
17/10/16 14:28:33 WARN core: *** Counter value after saving rdd: 0
17/10/16 14:28:33 WARN core: ******* Counter from Action ********
17/10/16 14:28:33 WARN core: *** Counter object after saving rdd: #object[org.apache.spark.util.LongAccumulator 0x321ca237 LongAccumulator(id: 1, name: Some(num-lines-action), value: 0)]
17/10/16 14:28:33 WARN core: *** Counter value after saving rdd: 0
17/10/16 14:28:33 WARN core: Results to: /tmp/accumulators_1508189311
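The final readout is the interesting part: both accumulators print value 0 on the driver, and their identity hashes (0x74606204, 0x321ca237) differ from the objects the tasks updated (0x4bec0eb8, 0x526eadf1). A plausible explanation, offered as an assumption since the gist source isn't shown: these reads went through serialized copies of the accumulators rather than the originals registered with the SparkContext. In Spark 2.x, serializing a registered accumulator from the driver side sends a copyAndReset copy, so a deserialized accumulator starts back at 0; task updates merge only into the registered original, looked up by id. A hedged sketch of the pitfall:

```clojure
;; Hypothetical demonstration of the copy pitfall: a Java-serialization
;; round trip of a registered accumulator yields a reset, detached copy.
(defn round-trip [obj]
  (let [bos (java.io.ByteArrayOutputStream.)]
    (with-open [oos (java.io.ObjectOutputStream. bos)]
      (.writeObject oos obj))
    (with-open [ois (java.io.ObjectInputStream.
                     (java.io.ByteArrayInputStream. (.toByteArray bos)))]
      (.readObject ois))))

(.value counter)               ; registered original: reflects merged task updates
(.value (round-trip counter))  ; detached, reset copy: 0
```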
17/10/16 14:28:33 INFO SparkContext: Starting job: saveAsTextFile at NativeMethodAccessorImpl.java:0
17/10/16 14:28:33 INFO DAGScheduler: Got job 2 (saveAsTextFile at NativeMethodAccessorImpl.java:0) with 1 output partitions
17/10/16 14:28:33 INFO DAGScheduler: Final stage: ResultStage 2 (saveAsTextFile at NativeMethodAccessorImpl.java:0)
17/10/16 14:28:33 INFO DAGScheduler: Parents of final stage: List()
17/10/16 14:28:33 INFO DAGScheduler: Missing parents: List()
17/10/16 14:28:33 INFO DAGScheduler: Submitting ResultStage 2 (MapPartitionsRDD[4] at saveAsTextFile at NativeMethodAccessorImpl.java:0), which has no missing parents
17/10/16 14:28:33 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 47.8 KB, free 2004.5 MB)
17/10/16 14:28:33 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 17.0 KB, free 2004.5 MB)
17/10/16 14:28:33 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on 10.0.201.111:62198 (size: 17.0 KB, free: 2004.6 MB)
17/10/16 14:28:33 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:996
17/10/16 14:28:33 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (MapPartitionsRDD[4] at saveAsTextFile at NativeMethodAccessorImpl.java:0)
17/10/16 14:28:33 INFO TaskSchedulerImpl: Adding task set 2.0 with 1 tasks
17/10/16 14:28:33 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 2, localhost, executor driver, partition 0, PROCESS_LOCAL, 5902 bytes)
17/10/16 14:28:33 INFO Executor: Running task 0.0 in stage 2.0 (TID 2)
17/10/16 14:28:33 INFO FileOutputCommitter: Saved output of task 'attempt_20171016142833_0002_m_000000_2' to file:/tmp/accumulators_1508189311/counters/_temporary/0/task_20171016142833_0002_m_000000
17/10/16 14:28:33 INFO SparkHadoopMapRedUtil: attempt_20171016142833_0002_m_000000_2: Committed
17/10/16 14:28:33 INFO Executor: Finished task 0.0 in stage 2.0 (TID 2). 970 bytes result sent to driver
17/10/16 14:28:33 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 2) in 44 ms on localhost (executor driver) (1/1)
17/10/16 14:28:33 INFO TaskSchedulerImpl: Removed TaskSet 2.0, whose tasks have all completed, from pool
17/10/16 14:28:33 INFO DAGScheduler: ResultStage 2 (saveAsTextFile at NativeMethodAccessorImpl.java:0) finished in 0.045 s
17/10/16 14:28:33 INFO DAGScheduler: Job 2 finished: saveAsTextFile at NativeMethodAccessorImpl.java:0, took 0.068931 s
17/10/16 14:28:33 INFO SparkContext: Invoking stop() from shutdown hook
17/10/16 14:28:33 INFO SparkUI: Stopped Spark web UI at http://10.0.201.111:4040
17/10/16 14:28:33 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/10/16 14:28:33 INFO MemoryStore: MemoryStore cleared
17/10/16 14:28:33 INFO BlockManager: BlockManager stopped
17/10/16 14:28:33 INFO BlockManagerMaster: BlockManagerMaster stopped
17/10/16 14:28:33 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/10/16 14:28:33 INFO SparkContext: Successfully stopped SparkContext
17/10/16 14:28:33 INFO ShutdownHookManager: Shutdown hook called
17/10/16 14:28:33 INFO ShutdownHookManager: Deleting directory /private/var/folders/lk/5zj87sj92cqcb2kfz8b_gywm0000gn/T/spark-13536f2a-2c52-4caa-93d0-b4bb57ec66a9