Spark PI tutorial output
Created March 14, 2016 13:24
Last login: Mon Mar 14 13:21:35 2016 from 10.0.2.2
[root@sandbox ~]# export SPARK_HOME=/usr/hdp/current/spark-client
[root@sandbox ~]# cd $SPARK_HOME
[root@sandbox spark-client]#
[root@sandbox spark-client]# netstat -lnptu | grep 4040
[root@sandbox spark-client]# su spark
[spark@sandbox spark-client]$ ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-client --num-executors 3 --driver-memory 512m --executor-memory 512m --executor-cores 1 lib/spark-examples*.jar 10
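The SparkPi example submitted above estimates π by Monte Carlo sampling: random points are thrown at the unit square, and the fraction landing inside the quarter circle, times 4, approximates π. A minimal single-machine sketch of the same idea (plain Python, no Spark; the sample count and seed are arbitrary choices, not taken from the Spark source):

```python
import random

def estimate_pi(num_samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate of pi: the fraction of random points in the
    unit square that fall inside the quarter circle, times 4."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / num_samples

print(estimate_pi(100_000))
```

The estimate converges slowly (error shrinks as 1/√n), which is why the real example spreads the sampling across many tasks.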
16/03/14 13:22:43 INFO SparkContext: Running Spark version 1.6.0
16/03/14 13:22:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/03/14 13:22:45 INFO SecurityManager: Changing view acls to: spark
16/03/14 13:22:45 INFO SecurityManager: Changing modify acls to: spark
16/03/14 13:22:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
16/03/14 13:22:45 INFO Utils: Successfully started service 'sparkDriver' on port 38786.
16/03/14 13:22:46 INFO Slf4jLogger: Slf4jLogger started
16/03/14 13:22:46 INFO Remoting: Starting remoting
16/03/14 13:22:47 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.0.2.15:40305]
16/03/14 13:22:47 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 40305.
16/03/14 13:22:47 INFO SparkEnv: Registering MapOutputTracker
16/03/14 13:22:47 INFO SparkEnv: Registering BlockManagerMaster
16/03/14 13:22:47 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-23daebb1-592d-493d-8b6c-170abba61d3a
16/03/14 13:22:47 INFO MemoryStore: MemoryStore started with capacity 143.6 MB
16/03/14 13:22:47 INFO SparkEnv: Registering OutputCommitCoordinator
16/03/14 13:22:48 INFO Server: jetty-8.y.z-SNAPSHOT
16/03/14 13:22:48 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/03/14 13:22:48 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/03/14 13:22:48 INFO SparkUI: Started SparkUI at http://10.0.2.15:4040
16/03/14 13:22:48 INFO HttpFileServer: HTTP File server directory is /tmp/spark-fd94b163-9007-4ab7-add6-1f5d8af037ec/httpd-7df2e16f-8de5-429e-9631-07655d96f9c5
16/03/14 13:22:48 INFO HttpServer: Starting HTTP Server
16/03/14 13:22:48 INFO Server: jetty-8.y.z-SNAPSHOT
16/03/14 13:22:48 INFO AbstractConnector: Started SocketConnector@0.0.0.0:43825
16/03/14 13:22:48 INFO Utils: Successfully started service 'HTTP file server' on port 43825.
16/03/14 13:22:49 INFO SparkContext: Added JAR file:/usr/hdp/2.4.0.0-169/spark/lib/spark-examples-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar at http://10.0.2.15:43825/jars/spark-examples-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar with timestamp 1457961769027
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
16/03/14 13:22:49 INFO TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
16/03/14 13:22:50 INFO RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
16/03/14 13:22:51 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
16/03/14 13:22:51 INFO Client: Requesting a new application from cluster with 1 NodeManagers
16/03/14 13:22:51 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2250 MB per container)
16/03/14 13:22:51 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
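The 896 MB AM container above is the 512 MB request plus YARN memory overhead. In Spark 1.6 the overhead defaults to the larger of 384 MB and 10% of the requested memory, so small requests like this one hit the 384 MB floor. A small sketch of that arithmetic (the factor and floor are the documented defaults; treat them as assumptions if your cluster overrides them):

```python
def am_container_memory(requested_mb: int,
                        overhead_factor: float = 0.10,
                        min_overhead_mb: int = 384) -> int:
    """Requested memory plus YARN overhead: the larger of a fixed
    floor (384 MB) and a fraction (10%) of the request."""
    overhead = max(min_overhead_mb, int(requested_mb * overhead_factor))
    return requested_mb + overhead

print(am_container_memory(512))  # 512 + max(384, 51) = 896, matching the log line
```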
16/03/14 13:22:51 INFO Client: Setting up container launch context for our AM
16/03/14 13:22:51 INFO Client: Setting up the launch environment for our AM container
16/03/14 13:22:51 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
16/03/14 13:22:51 INFO Client: Preparing resources for our AM container
16/03/14 13:22:51 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
16/03/14 13:22:51 INFO Client: Source and destination file systems are the same. Not copying hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.4.0.0-169/spark/spark-hdp-assembly.jar
16/03/14 13:22:52 INFO Client: Uploading resource file:/tmp/spark-fd94b163-9007-4ab7-add6-1f5d8af037ec/__spark_conf__950865978271557988.zip -> hdfs://sandbox.hortonworks.com:8020/user/spark/.sparkStaging/application_1457961042096_0001/__spark_conf__950865978271557988.zip
16/03/14 13:22:52 INFO SecurityManager: Changing view acls to: spark
16/03/14 13:22:52 INFO SecurityManager: Changing modify acls to: spark
16/03/14 13:22:52 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); users with modify permissions: Set(spark)
16/03/14 13:22:52 INFO Client: Submitting application 1 to ResourceManager
16/03/14 13:22:52 INFO YarnClientImpl: Submitted application application_1457961042096_0001
16/03/14 13:22:52 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1457961042096_0001 and attemptId None
16/03/14 13:22:53 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:22:54 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1457961772688
     final status: UNDEFINED
     tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1457961042096_0001/
     user: spark
16/03/14 13:22:55 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:22:56 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:22:57 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:22:58 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:22:59 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:23:00 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:23:01 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:23:01 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/03/14 13:23:01 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> sandbox.hortonworks.com, PROXY_URI_BASES -> http://sandbox.hortonworks.com:8088/proxy/application_1457961042096_0001), /proxy/application_1457961042096_0001
16/03/14 13:23:01 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/03/14 13:23:02 INFO Client: Application report for application_1457961042096_0001 (state: ACCEPTED)
16/03/14 13:23:03 INFO Client: Application report for application_1457961042096_0001 (state: RUNNING)
16/03/14 13:23:03 INFO Client:
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: 10.0.2.15
     ApplicationMaster RPC port: 0
     queue: default
     start time: 1457961772688
     final status: UNDEFINED
     tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1457961042096_0001/
     user: spark
16/03/14 13:23:03 INFO YarnClientSchedulerBackend: Application application_1457961042096_0001 has started running.
16/03/14 13:23:03 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 56728.
16/03/14 13:23:03 INFO NettyBlockTransferService: Server created on 56728
16/03/14 13:23:03 INFO BlockManagerMaster: Trying to register BlockManager
16/03/14 13:23:03 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.2.15:56728 with 143.6 MB RAM, BlockManagerId(driver, 10.0.2.15, 56728)
16/03/14 13:23:03 INFO BlockManagerMaster: Registered BlockManager
16/03/14 13:23:03 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1457961042096_0001
16/03/14 13:23:09 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (sandbox.hortonworks.com:37044) with ID 1
16/03/14 13:23:09 INFO BlockManagerMasterEndpoint: Registering block manager sandbox.hortonworks.com:47362 with 143.6 MB RAM, BlockManagerId(1, sandbox.hortonworks.com, 47362)
16/03/14 13:23:19 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
16/03/14 13:23:19 INFO SparkContext: Starting job: reduce at SparkPi.scala:36
16/03/14 13:23:19 INFO DAGScheduler: Got job 0 (reduce at SparkPi.scala:36) with 10 output partitions
16/03/14 13:23:19 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:36)
16/03/14 13:23:19 INFO DAGScheduler: Parents of final stage: List()
16/03/14 13:23:19 INFO DAGScheduler: Missing parents: List()
16/03/14 13:23:19 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32), which has no missing parents
16/03/14 13:23:20 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1888.0 B, free 1888.0 B)
16/03/14 13:23:20 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1202.0 B, free 3.0 KB)
16/03/14 13:23:20 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.0.2.15:56728 (size: 1202.0 B, free: 143.6 MB)
16/03/14 13:23:20 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
16/03/14 13:23:20 INFO DAGScheduler: Submitting 10 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:32)
16/03/14 13:23:20 INFO YarnScheduler: Adding task set 0.0 with 10 tasks
16/03/14 13:23:20 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, sandbox.hortonworks.com, partition 0,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:21 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on sandbox.hortonworks.com:47362 (size: 1202.0 B, free: 143.6 MB)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, sandbox.hortonworks.com, partition 1,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1935 ms on sandbox.hortonworks.com (1/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, sandbox.hortonworks.com, partition 2,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 199 ms on sandbox.hortonworks.com (2/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, sandbox.hortonworks.com, partition 3,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 58 ms on sandbox.hortonworks.com (3/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, sandbox.hortonworks.com, partition 4,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 47 ms on sandbox.hortonworks.com (4/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, sandbox.hortonworks.com, partition 5,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 77 ms on sandbox.hortonworks.com (5/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, sandbox.hortonworks.com, partition 6,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 41 ms on sandbox.hortonworks.com (6/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, sandbox.hortonworks.com, partition 7,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 47 ms on sandbox.hortonworks.com (7/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, sandbox.hortonworks.com, partition 8,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 86 ms on sandbox.hortonworks.com (8/10)
16/03/14 13:23:22 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, sandbox.hortonworks.com, partition 9,PROCESS_LOCAL, 2176 bytes)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 44 ms on sandbox.hortonworks.com (9/10)
16/03/14 13:23:22 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 87 ms on sandbox.hortonworks.com (10/10)
16/03/14 13:23:22 INFO YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/03/14 13:23:22 INFO DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:36) finished in 2.579 s
16/03/14 13:23:22 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:36, took 3.121408 s
Pi is roughly 3.143384
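The job above ran as 10 map tasks (one per slice, from the `10` argument to spark-submit) feeding a single reduce at SparkPi.scala:36: each partition counts its own hits, the per-partition counts are summed, and the final estimate is 4 × hits / samples. A sketch of that map/reduce shape in plain Python (the per-slice sample count is an illustrative assumption, not taken from the Spark source):

```python
import random
from functools import reduce

SLICES = 10                  # mirrors the "10" argument passed to spark-submit
SAMPLES_PER_SLICE = 100_000  # illustrative choice

def count_hits(slice_id: int) -> int:
    """One 'task': count random points landing inside the unit circle."""
    rng = random.Random(slice_id)
    return sum(1 for _ in range(SAMPLES_PER_SLICE)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

# map over the partitions, then reduce the per-partition counts with +
hits = reduce(lambda a, b: a + b, map(count_hits, range(SLICES)))
pi_estimate = 4.0 * hits / (SLICES * SAMPLES_PER_SLICE)
print("Pi is roughly %f" % pi_estimate)
```

In Spark the same two steps appear in the log as ResultStage 0 with 10 tasks (the map) and the reduce that finishes the job.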
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/api,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/static,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/executors,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/environment,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/storage,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/stages,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
16/03/14 13:23:22 INFO ContextHandler: stopped o.s.j.s.ServletContextHandler{/jobs,null}
16/03/14 13:23:23 INFO SparkUI: Stopped Spark web UI at http://10.0.2.15:4040
16/03/14 13:23:23 INFO YarnClientSchedulerBackend: Shutting down all executors
16/03/14 13:23:23 INFO YarnClientSchedulerBackend: Interrupting monitor thread
16/03/14 13:23:23 INFO YarnClientSchedulerBackend: Asking each executor to shut down
16/03/14 13:23:23 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices
(serviceOption=None,
services=List(),
started=false)
16/03/14 13:23:23 INFO YarnClientSchedulerBackend: Stopped
16/03/14 13:23:23 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/03/14 13:23:23 INFO MemoryStore: MemoryStore cleared
16/03/14 13:23:23 INFO BlockManager: BlockManager stopped
16/03/14 13:23:23 INFO BlockManagerMaster: BlockManagerMaster stopped
16/03/14 13:23:23 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/03/14 13:23:23 INFO SparkContext: Successfully stopped SparkContext
16/03/14 13:23:23 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/03/14 13:23:23 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/03/14 13:23:23 INFO ShutdownHookManager: Shutdown hook called
16/03/14 13:23:23 INFO ShutdownHookManager: Deleting directory /tmp/spark-fd94b163-9007-4ab7-add6-1f5d8af037ec
16/03/14 13:23:23 INFO ShutdownHookManager: Deleting directory /tmp/spark-fd94b163-9007-4ab7-add6-1f5d8af037ec/httpd-7df2e16f-8de5-429e-9631-07655d96f9c5 |