
@paul-lupu
Created March 14, 2016 12:45
Spark Pi tutorial output
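The transcript below runs Spark's bundled SparkPi example on a Hortonworks HDP 2.4 sandbox (Spark 1.6, YARN client mode). For reference, the same submit command in annotated form; the flag meanings follow the Spark 1.x `spark-submit` options, and the values are the ones used in this run, not recommendations:

```shell
# Annotated form of the spark-submit invocation used in the transcript.
#   --master yarn-client    driver runs in this shell; executors in YARN containers
#                           (Spark 1.x syntax; later versions use --master yarn --deploy-mode client)
#   --num-executors 3       request three executor containers from YARN
#   --driver-memory 512m    heap for the local driver JVM
#   --executor-memory 512m  heap per executor container
#   --executor-cores 1      one concurrent task per executor
#   trailing argument 10    number of slices (partitions) SparkPi splits its sampling into
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn-client \
  --num-executors 3 \
  --driver-memory 512m \
  --executor-memory 512m \
  --executor-cores 1 \
  lib/spark-examples*.jar 10
```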
[root@sandbox spark-client]# su spark
[spark@sandbox spark-client]$ ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client --num-executors 3 --driver-memory 512m --executor-memory 512m --executor-cores 1 lib/spark-examples*.jar 10
16/03/14 12:42:52 INFO SparkContext: Running Spark version 1.6.0
16/03/14 12:42:53 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/03/14 12:42:54 INFO SecurityManager: Changing view acls to: spark
16/03/14 12:42:54 INFO SecurityManager: Changing modify acls to: spark
16/03/14 12:42:54 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
16/03/14 12:42:54 INFO Utils: Successfully started service 'sparkDriver' on port 51735.
16/03/14 12:42:55 INFO Slf4jLogger: Slf4jLogger started
16/03/14 12:42:55 INFO Remoting: Starting remoting
16/03/14 12:42:56 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@172.16.255.159:44187]
16/03/14 12:42:56 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 44187.
16/03/14 12:42:56 INFO SparkEnv: Registering MapOutputTracker
16/03/14 12:42:56 INFO SparkEnv: Registering BlockManagerMaster
16/03/14 12:42:56 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-d8a21658-5f86-4d80-9a7d-642b22d9f618
16/03/14 12:42:56 INFO MemoryStore: MemoryStore started with capacity 143.6 MB
16/03/14 12:42:56 INFO SparkEnv: Registering OutputCommitCoordinator
16/03/14 12:42:56 INFO Server: jetty-8.y.z-SNAPSHOT
16/03/14 12:42:56 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/03/14 12:42:56 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/03/14 12:42:56 INFO SparkUI: Started SparkUI at http://172.16.255.159:4040
16/03/14 12:42:57 INFO HttpFileServer: HTTP File server directory is /tmp/spark-4128c37e-4b3b-4d08-a8a1-1f24986c1106/httpd-b8115f3e-61d8-47ad-b5af-cda4d210591e
16/03/14 12:42:57 INFO HttpServer: Starting HTTP Server
16/03/14 12:42:57 INFO Server: jetty-8.y.z-SNAPSHOT
16/03/14 12:42:57 INFO AbstractConnector: Started SocketConnector@0.0.0.0:47393
16/03/14 12:42:57 INFO Utils: Successfully started service 'HTTP file server' on port 47393.
16/03/14 12:42:57 INFO SparkContext: Added JAR file:/usr/hdp/2.4.0.0-169/spark/lib/spark-examples-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar at http://172.16.255.159:47393/jars/spark-examples-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar with timestamp 1457959377576
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
16/03/14 12:42:58 INFO TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
16/03/14 12:42:58 INFO RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/172.16.255.159:8050
16/03/14 12:42:59 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
16/03/14 12:42:59 INFO Client: Requesting a new application from cluster with 1 NodeManagers
16/03/14 12:42:59 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2250 MB per container)
16/03/14 12:42:59 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
16/03/14 12:42:59 INFO Client: Setting up container launch context for our AM
16/03/14 12:42:59 INFO Client: Setting up the launch environment for our AM container
16/03/14 12:43:00 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
16/03/14 12:43:00 INFO Client: Preparing resources for our AM container
16/03/14 12:43:00 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
16/03/14 12:43:00 INFO Client: Source and destination file systems are the same. Not copying hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
16/03/14 12:43:00 INFO Client: Uploading resource file:/tmp/spark-4128c37e-4b3b-4d08-a8a1-1f24986c1106/__spark_conf__7141156334140661248.zip -> hdfs://sandbox.hortonworks.com:8020/user/spark/.sparkStaging/application_1457958814604_0001/__spark_conf__7141156334140661248.zip
16/03/14 12:43:00 INFO SecurityManager: Changing view acls to: spark
16/03/14 12:43:00 INFO SecurityManager: Changing modify acls to: spark
16/03/14 12:43:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
16/03/14 12:43:00 INFO Client: Submitting application 1 to ResourceManager
16/03/14 12:43:01 INFO YarnClientImpl: Submitted application application_1457958814604_0001
16/03/14 12:43:01 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1457958814604_0001 and attemptId None
16/03/14 12:43:02 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:02 INFO Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1457959380965
final status: UNDEFINED
tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1457958814604_0001/
user: spark
16/03/14 12:43:03 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:04 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:05 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:06 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:07 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:08 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:09 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:10 INFO Client: Application report for application_1457958814604_0001 (state: ACCEPTED)
16/03/14 12:43:10 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/03/14 12:43:10 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sandbox.hortonworks.com, PROXY_URI_BASES -> http://sandbox.hortonworks.com:8088/proxy/application_1457958814604_0001), /proxy/application_1457958814604_0001
16/03/14 12:43:10 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/03/14 12:43:11 INFO Client: Application report for application_1457958814604_0001 (state: RUNNING)
16/03/14 12:43:11 INFO Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: 172.16.255.159
ApplicationMaster RPC port: 0
queue: default
start time: 1457959380965
final status: UNDEFINED
tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1457958814604_0001/
user: spark
16/03/14 12:43:11 INFO YarnClientSchedulerBackend: Application application_1457958814604_0001 has started running.
16/03/14 12:43:11 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45331.
16/03/14 12:43:11 INFO NettyBlockTransferService: Server created on 45331
16/03/14 12:43:11 INFO BlockManagerMaster: Trying to register BlockManager
16/03/14 12:43:11 INFO BlockManagerMasterEndpoint: Registering block manager 172.16.255.159:45331 with 143.6 MB RAM, BlockManagerId(driver, 172.16.255.159, 45331)
16/03/14 12:43:11 INFO BlockManagerMaster: Registered BlockManager
16/03/14 12:43:12 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1457958814604_0001
16/03/14 12:43:19 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (sandbox.hortonworks.com:53096) with ID 1
16/03/14 12:43:19 INFO BlockManagerMasterEndpoint: Registering block manager sandbox.hortonworks.com:41159 with 143.6 MB RAM, BlockManagerId(1, sandbox.hortonworks.com, 41159)
16/03/14 12:43:27 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
16/03/14 12:43:28 INFO SparkContext: Starting job: reduce at SparkPi.scala:36
16/03/14 12:43:28 INFO DAGScheduler: Got job 0 (reduce at SparkPi.scala:36) with 10 output partitions
16/03/14 12:43:28 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:36)
16/03/14 12:43:28 INFO DAGScheduler: Parents of final stage: List()
16/03/14 12:43:28 INFO DAGScheduler: Missing parents: List()
16/03/14 12:43:28 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32), which has no missing parents
16/03/14 12:43:28 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1888.0 B, free 1888.0 B)
16/03/14 12:43:28 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1202.0 B, free 3.0 KB)
16/03/14 12:43:28 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 172.16.255.159:45331 (size: 1202.0 B, free: 143.6 MB)
16/03/14 12:43:28 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
16/03/14 12:43:28 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32)
16/03/14 12:43:28 INFO YarnScheduler: Adding task set 0.0 with 10 tasks
16/03/14 12:43:28 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, sandbox.hortonworks.com, partition 0,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:30 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on sandbox.hortonworks.com:41159 (size: 1202.0 B, free: 143.6 MB)
16/03/14 12:43:30 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, sandbox.hortonworks.com, partition 1,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:30 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 2476 ms on sandbox.hortonworks.com (1/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, sandbox.hortonworks.com, partition 2,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 278 ms on sandbox.hortonworks.com (2/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, sandbox.hortonworks.com, partition 3,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 137 ms on sandbox.hortonworks.com (3/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, sandbox.hortonworks.com, partition 4,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 320 ms on sandbox.hortonworks.com (4/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, sandbox.hortonworks.com, partition 5,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 52 ms on sandbox.hortonworks.com (5/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, sandbox.hortonworks.com, partition 6,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 50 ms on sandbox.hortonworks.com (6/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, sandbox.hortonworks.com, partition 7,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 45 ms on sandbox.hortonworks.com (7/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, sandbox.hortonworks.com, partition 8,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 56 ms on sandbox.hortonworks.com (8/10)
16/03/14 12:43:31 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, sandbox.hortonworks.com, partition 9,PROCESS_LOCAL, 2181 bytes)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 55 ms on sandbox.hortonworks.com (9/10)
16/03/14 12:43:31 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 49 ms on sandbox.hortonworks.com (10/10)
16/03/14 12:43:31 INFO DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:36) finished in 3.413 s
16/03/14 12:43:31 INFO YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/03/14 12:43:31 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:36, took 3.811183 s
Pi is roughly 3.142108
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/03/14 12:43:31 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/03/14 12:43:31 INFO SparkUI: Stopped Spark web UI at http://172.16.255.159:4040
16/03/14 12:43:31 INFO YarnClientSchedulerBackend: Shutting down all executors
16/03/14 12:43:31 INFO YarnClientSchedulerBackend: Interrupting monitor thread
16/03/14 12:43:31 INFO YarnClientSchedulerBackend: Asking each executor to shut down
16/03/14 12:43:31 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
16/03/14 12:43:31 INFO YarnClientSchedulerBackend: Stopped
16/03/14 12:43:32 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/03/14 12:43:32 INFO MemoryStore: MemoryStore cleared
16/03/14 12:43:32 INFO BlockManager: BlockManager stopped
16/03/14 12:43:32 INFO BlockManagerMaster: BlockManagerMaster stopped
16/03/14 12:43:32 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/03/14 12:43:32 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/03/14 12:43:32 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/03/14 12:43:32 INFO SparkContext: Successfully stopped SparkContext
16/03/14 12:43:32 INFO ShutdownHookManager: Shutdown hook called
16/03/14 12:43:32 INFO ShutdownHookManager: Deleting directory /tmp/spark-4128c37e-4b3b-4d08-a8a1-1f24986c1106
16/03/14 12:43:32 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
16/03/14 12:43:32 INFO ShutdownHookManager: Deleting directory /tmp/spark-4128c37e-4b3b-4d08-a8a1-1f24986c1106/httpd-b8115f3e-61d8-47ad-b5af-cda4d210591e
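The `Pi is roughly 3.142108` line is a Monte Carlo estimate: SparkPi samples random points in a square, counts how many land inside the inscribed circle, and scales the ratio by 4. A minimal single-machine sketch of the same calculation in plain Python (no Spark; the sample count and seed are illustrative, not the values Spark uses):

```python
import random

def estimate_pi(num_samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate of pi.

    Sample points uniformly in the unit square [0, 1) x [0, 1) and count
    the fraction that fall inside the quarter circle x^2 + y^2 <= 1.
    That fraction approximates pi/4, so multiply by 4.
    """
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / num_samples

if __name__ == "__main__":
    print(estimate_pi(100_000))
```

Spark's version distributes the sampling loop across the 10 partitions requested on the command line and combines the per-partition counts with a `reduce` (the `reduce at SparkPi.scala:36` seen in the log); the estimate's accuracy depends only on the total sample count, which is why repeated runs print slightly different values near 3.14.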