Created April 13, 2020 07:43
SPARK-31432
[sfuruyam@sfuruyam2 dist]$ ./bin/run-example SparkPi
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/sfuruyam/dist/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.4.0-91/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
20/04/13 14:38:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/04/13 14:38:21 INFO SparkContext: Running Spark version 3.1.0-SNAPSHOT
20/04/13 14:38:21 INFO ResourceUtils: ==============================================================
20/04/13 14:38:21 INFO ResourceUtils: No custom resources configured for spark.driver.
20/04/13 14:38:21 INFO ResourceUtils: ==============================================================
20/04/13 14:38:21 INFO SparkContext: Submitted application: Spark Pi
20/04/13 14:38:21 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
20/04/13 14:38:21 INFO ResourceProfile: Limiting resource is cpu
20/04/13 14:38:21 INFO ResourceProfileManager: Added ResourceProfile id: 0
20/04/13 14:38:21 INFO SecurityManager: Changing view acls to: sfuruyam
20/04/13 14:38:21 INFO SecurityManager: Changing modify acls to: sfuruyam
20/04/13 14:38:21 INFO SecurityManager: Changing view acls groups to:
20/04/13 14:38:21 INFO SecurityManager: Changing modify acls groups to:
20/04/13 14:38:21 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sfuruyam); groups with view permissions: Set(); users with modify permissions: Set(sfuruyam); groups with modify permissions: Set()
20/04/13 14:38:21 INFO Utils: Successfully started service 'sparkDriver' on port 46136.
20/04/13 14:38:21 INFO SparkEnv: Registering MapOutputTracker
20/04/13 14:38:22 INFO SparkEnv: Registering BlockManagerMaster
20/04/13 14:38:22 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/04/13 14:38:22 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/04/13 14:38:22 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
20/04/13 14:38:22 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-7f4f69dc-5c69-49c9-8fdc-08ab284f3e13
20/04/13 14:38:22 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
20/04/13 14:38:22 INFO SparkEnv: Registering OutputCommitCoordinator
20/04/13 14:38:22 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/04/13 14:38:22 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://testing-machine:4040
20/04/13 14:38:22 INFO SparkContext: Added JAR file:///home/sfuruyam/dist/examples/jars/scopt_2.12-3.7.1.jar at spark://testing-machine:46136/jars/scopt_2.12-3.7.1.jar with timestamp 1586756302794
20/04/13 14:38:22 INFO SparkContext: Added JAR file:///home/sfuruyam/dist/examples/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar at spark://testing-machine:46136/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586756302795
20/04/13 14:38:23 INFO Executor: Starting executor ID driver on host testing-machine
20/04/13 14:38:23 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 42037.
20/04/13 14:38:23 INFO NettyBlockTransferService: Server created on testing-machine:42037
20/04/13 14:38:23 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/04/13 14:38:23 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, testing-machine, 42037, None)
20/04/13 14:38:23 INFO BlockManagerMasterEndpoint: Registering block manager testing-machine:42037 with 366.3 MiB RAM, BlockManagerId(driver, testing-machine, 42037, None)
20/04/13 14:38:23 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, testing-machine, 42037, None)
20/04/13 14:38:23 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, testing-machine, 42037, None)
20/04/13 14:38:24 INFO SparkContext: Starting job: reduce at SparkPi.scala:38
20/04/13 14:38:24 INFO DAGScheduler: Got job 0 (reduce at SparkPi.scala:38) with 2 output partitions
20/04/13 14:38:24 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
20/04/13 14:38:24 INFO DAGScheduler: Parents of final stage: List()
20/04/13 14:38:24 INFO DAGScheduler: Missing parents: List()
20/04/13 14:38:24 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
20/04/13 14:38:24 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.1 KiB, free 366.3 MiB)
20/04/13 14:38:24 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1816.0 B, free 366.3 MiB)
20/04/13 14:38:24 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on testing-machine:42037 (size: 1816.0 B, free: 366.3 MiB)
20/04/13 14:38:24 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1270
20/04/13 14:38:24 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34) (first 15 tasks are for partitions Vector(0, 1))
20/04/13 14:38:24 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks resource profile 0
20/04/13 14:38:24 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, testing-machine, executor driver, partition 0, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:38:24 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, testing-machine, executor driver, partition 1, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:38:24 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
20/04/13 14:38:24 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
20/04/13 14:38:24 INFO Executor: Fetching spark://testing-machine:46136/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586756302795
20/04/13 14:38:24 INFO TransportClientFactory: Successfully created connection to testing-machine/172.17.42.46:46136 after 60 ms (0 ms spent in bootstraps)
20/04/13 14:38:24 INFO Utils: Fetching spark://testing-machine:46136/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar to /tmp/spark-78773d67-9cb0-446f-8dc8-ef2e6f6e031d/userFiles-9d54f354-912f-4f20-b647-c48873645345/fetchFileTemp781259794535970524.tmp
20/04/13 14:38:24 INFO Executor: Adding file:/tmp/spark-78773d67-9cb0-446f-8dc8-ef2e6f6e031d/userFiles-9d54f354-912f-4f20-b647-c48873645345/spark-examples_2.12-3.1.0-SNAPSHOT.jar to class loader
20/04/13 14:38:24 INFO Executor: Fetching spark://testing-machine:46136/jars/scopt_2.12-3.7.1.jar with timestamp 1586756302794
20/04/13 14:38:24 INFO Utils: Fetching spark://testing-machine:46136/jars/scopt_2.12-3.7.1.jar to /tmp/spark-78773d67-9cb0-446f-8dc8-ef2e6f6e031d/userFiles-9d54f354-912f-4f20-b647-c48873645345/fetchFileTemp8877351177876620352.tmp
20/04/13 14:38:24 INFO Executor: Adding file:/tmp/spark-78773d67-9cb0-446f-8dc8-ef2e6f6e031d/userFiles-9d54f354-912f-4f20-b647-c48873645345/scopt_2.12-3.7.1.jar to class loader
20/04/13 14:38:25 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1000 bytes result sent to driver
20/04/13 14:38:25 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1000 bytes result sent to driver
20/04/13 14:38:25 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1058 ms on testing-machine (executor driver) (1/2)
20/04/13 14:38:25 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1116 ms on testing-machine (executor driver) (2/2)
20/04/13 14:38:25 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/04/13 14:38:25 INFO DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) finished in 1.442 s
20/04/13 14:38:25 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
20/04/13 14:38:25 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
20/04/13 14:38:25 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:38, took 1.523894 s
Pi is roughly 3.1467757338786693
20/04/13 14:38:25 INFO SparkUI: Stopped Spark web UI at http://testing-machine:4040
20/04/13 14:38:25 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/04/13 14:38:25 INFO MemoryStore: MemoryStore cleared
20/04/13 14:38:25 INFO BlockManager: BlockManager stopped
20/04/13 14:38:25 INFO BlockManagerMaster: BlockManagerMaster stopped
20/04/13 14:38:25 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/04/13 14:38:25 INFO SparkContext: Successfully stopped SparkContext
20/04/13 14:38:25 INFO ShutdownHookManager: Shutdown hook called
20/04/13 14:38:25 INFO ShutdownHookManager: Deleting directory /tmp/spark-bd4f7d09-36d2-4bfb-a32a-eeed31c7e3e3
20/04/13 14:38:25 INFO ShutdownHookManager: Deleting directory /tmp/spark-78773d67-9cb0-446f-8dc8-ef2e6f6e031d
[sfuruyam@sfuruyam2 dist]$ mv jars/ jars-moved
[sfuruyam@sfuruyam2 dist]$ ./bin/run-example SparkPi
Failed to find Spark jars directory (/home/sfuruyam/dist/assembly/target/scala-2.12/jars).
You need to set an appropriate directory for SPARK_JARS_DIR or build Spark with the target "package" before running this program.
[sfuruyam@sfuruyam2 dist]$ export SPARK_JARS_DIR=$SPARK_HOME/jars-moved
[sfuruyam@sfuruyam2 dist]$ ./bin/run-example SparkPi
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/sfuruyam/dist/jars-moved/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.4.0-91/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
20/04/13 14:38:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/04/13 14:38:45 INFO SparkContext: Running Spark version 3.1.0-SNAPSHOT
20/04/13 14:38:45 INFO ResourceUtils: ==============================================================
20/04/13 14:38:45 INFO ResourceUtils: No custom resources configured for spark.driver.
20/04/13 14:38:45 INFO ResourceUtils: ==============================================================
20/04/13 14:38:45 INFO SparkContext: Submitted application: Spark Pi
20/04/13 14:38:45 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
20/04/13 14:38:45 INFO ResourceProfile: Limiting resource is cpu
20/04/13 14:38:45 INFO ResourceProfileManager: Added ResourceProfile id: 0
20/04/13 14:38:45 INFO SecurityManager: Changing view acls to: sfuruyam
20/04/13 14:38:45 INFO SecurityManager: Changing modify acls to: sfuruyam
20/04/13 14:38:45 INFO SecurityManager: Changing view acls groups to:
20/04/13 14:38:45 INFO SecurityManager: Changing modify acls groups to:
20/04/13 14:38:45 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sfuruyam); groups with view permissions: Set(); users with modify permissions: Set(sfuruyam); groups with modify permissions: Set()
20/04/13 14:38:46 INFO Utils: Successfully started service 'sparkDriver' on port 41937.
20/04/13 14:38:46 INFO SparkEnv: Registering MapOutputTracker
20/04/13 14:38:46 INFO SparkEnv: Registering BlockManagerMaster
20/04/13 14:38:46 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/04/13 14:38:46 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/04/13 14:38:46 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
20/04/13 14:38:46 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-21c664d5-9d35-4b76-bc95-91dbedb74e52
20/04/13 14:38:46 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
20/04/13 14:38:46 INFO SparkEnv: Registering OutputCommitCoordinator
20/04/13 14:38:46 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/04/13 14:38:47 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://testing-machine:4040
20/04/13 14:38:47 INFO SparkContext: Added JAR file:///home/sfuruyam/dist/examples/jars/scopt_2.12-3.7.1.jar at spark://testing-machine:41937/jars/scopt_2.12-3.7.1.jar with timestamp 1586756327136
20/04/13 14:38:47 INFO SparkContext: Added JAR file:///home/sfuruyam/dist/examples/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar at spark://testing-machine:41937/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586756327137
20/04/13 14:38:47 INFO Executor: Starting executor ID driver on host testing-machine
20/04/13 14:38:47 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35334.
20/04/13 14:38:47 INFO NettyBlockTransferService: Server created on testing-machine:35334
20/04/13 14:38:47 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/04/13 14:38:47 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, testing-machine, 35334, None)
20/04/13 14:38:47 INFO BlockManagerMasterEndpoint: Registering block manager testing-machine:35334 with 366.3 MiB RAM, BlockManagerId(driver, testing-machine, 35334, None)
20/04/13 14:38:47 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, testing-machine, 35334, None)
20/04/13 14:38:47 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, testing-machine, 35334, None)
20/04/13 14:38:48 INFO SparkContext: Starting job: reduce at SparkPi.scala:38
20/04/13 14:38:48 INFO DAGScheduler: Got job 0 (reduce at SparkPi.scala:38) with 2 output partitions
20/04/13 14:38:48 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
20/04/13 14:38:48 INFO DAGScheduler: Parents of final stage: List()
20/04/13 14:38:48 INFO DAGScheduler: Missing parents: List()
20/04/13 14:38:48 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
20/04/13 14:38:48 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.1 KiB, free 366.3 MiB)
20/04/13 14:38:48 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1816.0 B, free 366.3 MiB)
20/04/13 14:38:48 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on testing-machine:35334 (size: 1816.0 B, free: 366.3 MiB)
20/04/13 14:38:48 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1270
20/04/13 14:38:48 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34) (first 15 tasks are for partitions Vector(0, 1))
20/04/13 14:38:48 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks resource profile 0
20/04/13 14:38:48 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, testing-machine, executor driver, partition 0, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:38:48 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, testing-machine, executor driver, partition 1, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:38:48 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
20/04/13 14:38:48 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
20/04/13 14:38:48 INFO Executor: Fetching spark://testing-machine:41937/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586756327137
20/04/13 14:38:49 INFO TransportClientFactory: Successfully created connection to testing-machine/172.17.42.46:41937 after 76 ms (0 ms spent in bootstraps)
20/04/13 14:38:49 INFO Utils: Fetching spark://testing-machine:41937/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar to /tmp/spark-c68e2c5f-bb1d-4286-b9d8-dfa828969689/userFiles-bc8b17e9-8fcc-4895-8699-c3c3038052bd/fetchFileTemp8082651768493829704.tmp
20/04/13 14:38:49 INFO Executor: Adding file:/tmp/spark-c68e2c5f-bb1d-4286-b9d8-dfa828969689/userFiles-bc8b17e9-8fcc-4895-8699-c3c3038052bd/spark-examples_2.12-3.1.0-SNAPSHOT.jar to class loader
20/04/13 14:38:49 INFO Executor: Fetching spark://testing-machine:41937/jars/scopt_2.12-3.7.1.jar with timestamp 1586756327136
20/04/13 14:38:49 INFO Utils: Fetching spark://testing-machine:41937/jars/scopt_2.12-3.7.1.jar to /tmp/spark-c68e2c5f-bb1d-4286-b9d8-dfa828969689/userFiles-bc8b17e9-8fcc-4895-8699-c3c3038052bd/fetchFileTemp7386988972790801680.tmp
20/04/13 14:38:49 INFO Executor: Adding file:/tmp/spark-c68e2c5f-bb1d-4286-b9d8-dfa828969689/userFiles-bc8b17e9-8fcc-4895-8699-c3c3038052bd/scopt_2.12-3.7.1.jar to class loader
20/04/13 14:38:49 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1000 bytes result sent to driver
20/04/13 14:38:49 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1000 bytes result sent to driver
20/04/13 14:38:49 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1136 ms on testing-machine (executor driver) (1/2)
20/04/13 14:38:49 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1079 ms on testing-machine (executor driver) (2/2)
20/04/13 14:38:49 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/04/13 14:38:49 INFO DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) finished in 1.443 s
20/04/13 14:38:49 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
20/04/13 14:38:49 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
20/04/13 14:38:49 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:38, took 1.527886 s
Pi is roughly 3.147635738178691
20/04/13 14:38:50 INFO SparkUI: Stopped Spark web UI at http://testing-machine:4040
20/04/13 14:38:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/04/13 14:38:50 INFO MemoryStore: MemoryStore cleared
20/04/13 14:38:50 INFO BlockManager: BlockManager stopped
20/04/13 14:38:50 INFO BlockManagerMaster: BlockManagerMaster stopped
20/04/13 14:38:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/04/13 14:38:50 INFO SparkContext: Successfully stopped SparkContext
20/04/13 14:38:50 INFO ShutdownHookManager: Shutdown hook called
20/04/13 14:38:50 INFO ShutdownHookManager: Deleting directory /tmp/spark-c68e2c5f-bb1d-4286-b9d8-dfa828969689
20/04/13 14:38:50 INFO ShutdownHookManager: Deleting directory /tmp/spark-05926e2d-ada5-4a35-bdf1-201d75af232c
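The session above (run, move `jars/`, fail, set `SPARK_JARS_DIR`, run again) exercises the jars-directory lookup in the launcher scripts. Below is a minimal, hypothetical sketch of that lookup as a standalone function; the function name and the simplified fallback chain are assumptions for illustration only (the real `bin/spark-class` also falls back to `assembly/target/scala-<version>/jars` and validates the launcher jar):

```shell
#!/bin/sh
# Hypothetical sketch of the jars-directory resolution that SPARK-31432
# is exercising: a user-supplied SPARK_JARS_DIR takes precedence over
# the packaged $SPARK_HOME/jars layout.
resolve_jars_dir() {
  spark_home="$1"
  if [ -n "${SPARK_JARS_DIR:-}" ] && [ -d "$SPARK_JARS_DIR" ]; then
    echo "$SPARK_JARS_DIR"          # explicit override wins
  elif [ -d "$spark_home/jars" ]; then
    echo "$spark_home/jars"         # packaged distribution layout
  else
    echo "Failed to find Spark jars directory." >&2
    return 1
  fi
}

# Replay the session above against a throwaway directory layout.
unset SPARK_JARS_DIR
tmp=$(mktemp -d)
mkdir "$tmp/jars"
resolve_jars_dir "$tmp"                        # finds $tmp/jars
mv "$tmp/jars" "$tmp/jars-moved"
resolve_jars_dir "$tmp" || echo "lookup fails, as in the log"
export SPARK_JARS_DIR="$tmp/jars-moved"
resolve_jars_dir "$tmp"                        # finds the moved directory
rm -rf "$tmp"
```

This mirrors the observed behavior: after `mv jars/ jars-moved` the run fails until `SPARK_JARS_DIR` points at the relocated directory.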
C:\Users\sfuruyam\dist>bin\run-example.cmd SparkPi
20/04/13 14:55:03 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
	at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:382)
	at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:397)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:390)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2823)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.spark.deploy.DependencyUtils$.resolveGlobPath(DependencyUtils.scala:191)
	at org.apache.spark.deploy.DependencyUtils$.$anonfun$resolveGlobPaths$2(DependencyUtils.scala:147)
	at org.apache.spark.deploy.DependencyUtils$.$anonfun$resolveGlobPaths$2$adapted(DependencyUtils.scala:145)
	at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:245)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:38)
	at scala.collection.TraversableLike.flatMap(TraversableLike.scala:245)
	at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:242)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
	at org.apache.spark.deploy.DependencyUtils$.resolveGlobPaths(DependencyUtils.scala:145)
	at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$4(SparkSubmit.scala:363)
	at scala.Option.map(Option.scala:230)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:363)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:877)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1013)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1022)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/04/13 14:55:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/04/13 14:55:04 INFO SparkContext: Running Spark version 3.1.0-SNAPSHOT
20/04/13 14:55:04 INFO ResourceUtils: ==============================================================
20/04/13 14:55:04 INFO ResourceUtils: No custom resources configured for spark.driver.
20/04/13 14:55:04 INFO ResourceUtils: ==============================================================
20/04/13 14:55:04 INFO SparkContext: Submitted application: Spark Pi
20/04/13 14:55:04 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
20/04/13 14:55:04 INFO ResourceProfile: Limiting resource is cpu
20/04/13 14:55:04 INFO ResourceProfileManager: Added ResourceProfile id: 0
20/04/13 14:55:04 INFO SecurityManager: Changing view acls to: sfuruyam
20/04/13 14:55:04 INFO SecurityManager: Changing modify acls to: sfuruyam
20/04/13 14:55:04 INFO SecurityManager: Changing view acls groups to:
20/04/13 14:55:04 INFO SecurityManager: Changing modify acls groups to:
20/04/13 14:55:04 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sfuruyam); groups with view permissions: Set(); users with modify permissions: Set(sfuruyam); groups with modify permissions: Set()
20/04/13 14:55:05 INFO Utils: Successfully started service 'sparkDriver' on port 61555.
20/04/13 14:55:05 INFO SparkEnv: Registering MapOutputTracker
20/04/13 14:55:05 INFO SparkEnv: Registering BlockManagerMaster
20/04/13 14:55:05 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/04/13 14:55:05 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/04/13 14:55:05 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
20/04/13 14:55:05 INFO DiskBlockManager: Created local directory at C:\Users\sfuruyam\AppData\Local\Temp\blockmgr-0c9c9ec7-e6f2-4cea-9f30-2d87879d0d7f
20/04/13 14:55:05 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
20/04/13 14:55:05 INFO SparkEnv: Registering OutputCommitCoordinator
20/04/13 14:55:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/04/13 14:55:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://windows-machine.domain.local:4040
20/04/13 14:55:06 INFO SparkContext: Added JAR file:///C:/Users/sfuruyam/dist/examples/jars/scopt_2.12-3.7.1.jar at spark://windows-machine.domain.local:61555/jars/scopt_2.12-3.7.1.jar with timestamp 1586757306368
20/04/13 14:55:06 INFO SparkContext: Added JAR file:///C:/Users/sfuruyam/dist/examples/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar at spark://windows-machine.domain.local:61555/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586757306369
20/04/13 14:55:06 INFO Executor: Starting executor ID driver on host windows-machine.domain.local
20/04/13 14:55:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 61598.
20/04/13 14:55:06 INFO NettyBlockTransferService: Server created on windows-machine.domain.local:61598
20/04/13 14:55:06 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/04/13 14:55:06 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, windows-machine.domain.local, 61598, None)
20/04/13 14:55:06 INFO BlockManagerMasterEndpoint: Registering block manager windows-machine.domain.local:61598 with 366.3 MiB RAM, BlockManagerId(driver, windows-machine.domain.local, 61598, None)
20/04/13 14:55:06 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, windows-machine.domain.local, 61598, None)
20/04/13 14:55:06 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, windows-machine.domain.local, 61598, None)
20/04/13 14:55:07 INFO SparkContext: Starting job: reduce at SparkPi.scala:38
20/04/13 14:55:07 INFO DAGScheduler: Got job 0 (reduce at SparkPi.scala:38) with 2 output partitions
20/04/13 14:55:07 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
20/04/13 14:55:07 INFO DAGScheduler: Parents of final stage: List()
20/04/13 14:55:07 INFO DAGScheduler: Missing parents: List()
20/04/13 14:55:07 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
20/04/13 14:55:07 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.1 KiB, free 366.3 MiB)
20/04/13 14:55:07 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1816.0 B, free 366.3 MiB)
20/04/13 14:55:07 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on windows-machine.domain.local:61598 (size: 1816.0 B, free: 366.3 MiB)
20/04/13 14:55:07 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1270
20/04/13 14:55:07 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34) (first 15 tasks are for partitions Vector(0, 1))
20/04/13 14:55:07 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks resource profile 0
20/04/13 14:55:07 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, windows-machine.domain.local, executor driver, partition 0, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:55:07 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, windows-machine.domain.local, executor driver, partition 1, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:55:07 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
20/04/13 14:55:07 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
20/04/13 14:55:07 INFO Executor: Fetching spark://windows-machine.domain.local:61555/jars/scopt_2.12-3.7.1.jar with timestamp 1586757306368
20/04/13 14:55:08 INFO TransportClientFactory: Successfully created connection to windows-machine.domain.local/10.10.138.148:61555 after 49 ms (0 ms spent in bootstraps)
20/04/13 14:55:08 INFO Utils: Fetching spark://windows-machine.domain.local:61555/jars/scopt_2.12-3.7.1.jar to C:\Users\sfuruyam\AppData\Local\Temp\spark-481e2a32-f8ca-4293-9cc6-37a2e845b3a8\userFiles-282a9559-ed81-419b-8592-fbbf9f040a7f\fetchFileTemp5221711423608197831.tmp
20/04/13 14:55:08 INFO Executor: Adding file:/C:/Users/sfuruyam/AppData/Local/Temp/spark-481e2a32-f8ca-4293-9cc6-37a2e845b3a8/userFiles-282a9559-ed81-419b-8592-fbbf9f040a7f/scopt_2.12-3.7.1.jar to class loader
20/04/13 14:55:08 INFO Executor: Fetching spark://windows-machine.domain.local:61555/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586757306369
20/04/13 14:55:08 INFO Utils: Fetching spark://windows-machine.domain.local:61555/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar to C:\Users\sfuruyam\AppData\Local\Temp\spark-481e2a32-f8ca-4293-9cc6-37a2e845b3a8\userFiles-282a9559-ed81-419b-8592-fbbf9f040a7f\fetchFileTemp7571198370407628396.tmp
20/04/13 14:55:08 INFO Executor: Adding file:/C:/Users/sfuruyam/AppData/Local/Temp/spark-481e2a32-f8ca-4293-9cc6-37a2e845b3a8/userFiles-282a9559-ed81-419b-8592-fbbf9f040a7f/spark-examples_2.12-3.1.0-SNAPSHOT.jar to class loader
20/04/13 14:55:08 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1000 bytes result sent to driver
20/04/13 14:55:08 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1000 bytes result sent to driver
20/04/13 14:55:08 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 890 ms on windows-machine.domain.local (executor driver) (1/2)
20/04/13 14:55:08 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 926 ms on windows-machine.domain.local (executor driver) (2/2)
20/04/13 14:55:08 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/04/13 14:55:08 INFO DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) finished in 1.327 s
20/04/13 14:55:08 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
20/04/13 14:55:08 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
20/04/13 14:55:08 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:38, took 1.503580 s
Pi is roughly 3.1424157120785603
20/04/13 14:55:08 INFO SparkUI: Stopped Spark web UI at http://windows-machine.domain.local:4040
20/04/13 14:55:08 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/04/13 14:55:09 INFO MemoryStore: MemoryStore cleared
20/04/13 14:55:09 INFO BlockManager: BlockManager stopped
20/04/13 14:55:09 INFO BlockManagerMaster: BlockManagerMaster stopped
20/04/13 14:55:09 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/04/13 14:55:09 INFO SparkContext: Successfully stopped SparkContext
20/04/13 14:55:09 INFO ShutdownHookManager: Shutdown hook called
20/04/13 14:55:09 INFO ShutdownHookManager: Deleting directory C:\Users\sfuruyam\AppData\Local\Temp\spark-481e2a32-f8ca-4293-9cc6-37a2e845b3a8
20/04/13 14:55:09 INFO ShutdownHookManager: Deleting directory C:\Users\sfuruyam\AppData\Local\Temp\spark-7d346241-b106-463d-9935-ec1676d42335
C:\Users\sfuruyam\dist>move jars jars-moved | |
        1 dir(s) moved.
C:\Users\sfuruyam\dist>bin\run-example.cmd SparkPi
Failed to find Spark jars directory.
You need to set an appropriate directory for SPARK_JARS_DIR or build Spark before running this program.
C:\Users\sfuruyam\dist>set SPARK_JARS_DIR=C:\Users\sfuruyam\dist\jars-moved
C:\Users\sfuruyam\dist>bin\run-example.cmd SparkPi
20/04/13 14:55:54 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
	at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:382)
	at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:397)
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:390)
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:80)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2823)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2684)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.spark.deploy.DependencyUtils$.resolveGlobPath(DependencyUtils.scala:191)
	at org.apache.spark.deploy.DependencyUtils$.$anonfun$resolveGlobPaths$2(DependencyUtils.scala:147)
	at org.apache.spark.deploy.DependencyUtils$.$anonfun$resolveGlobPaths$2$adapted(DependencyUtils.scala:145)
	at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:245)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:38)
	at scala.collection.TraversableLike.flatMap(TraversableLike.scala:245)
	at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:242)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
	at org.apache.spark.deploy.DependencyUtils$.resolveGlobPaths(DependencyUtils.scala:145)
	at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$4(SparkSubmit.scala:363)
	at scala.Option.map(Option.scala:230)
	at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:363)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:877)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1013)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1022)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/04/13 14:55:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/04/13 14:55:55 INFO SparkContext: Running Spark version 3.1.0-SNAPSHOT
20/04/13 14:55:55 INFO ResourceUtils: ==============================================================
20/04/13 14:55:55 INFO ResourceUtils: No custom resources configured for spark.driver.
20/04/13 14:55:55 INFO ResourceUtils: ==============================================================
20/04/13 14:55:55 INFO SparkContext: Submitted application: Spark Pi
20/04/13 14:55:55 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
20/04/13 14:55:55 INFO ResourceProfile: Limiting resource is cpu
20/04/13 14:55:55 INFO ResourceProfileManager: Added ResourceProfile id: 0
20/04/13 14:55:55 INFO SecurityManager: Changing view acls to: sfuruyam
20/04/13 14:55:55 INFO SecurityManager: Changing modify acls to: sfuruyam
20/04/13 14:55:55 INFO SecurityManager: Changing view acls groups to:
20/04/13 14:55:55 INFO SecurityManager: Changing modify acls groups to:
20/04/13 14:55:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(sfuruyam); groups with view permissions: Set(); users with modify permissions: Set(sfuruyam); groups with modify permissions: Set()
20/04/13 14:55:57 INFO Utils: Successfully started service 'sparkDriver' on port 61644.
20/04/13 14:55:57 INFO SparkEnv: Registering MapOutputTracker
20/04/13 14:55:57 INFO SparkEnv: Registering BlockManagerMaster
20/04/13 14:55:57 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
20/04/13 14:55:57 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
20/04/13 14:55:57 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
20/04/13 14:55:57 INFO DiskBlockManager: Created local directory at C:\Users\sfuruyam\AppData\Local\Temp\blockmgr-3a062742-e44b-404e-b5be-ff76a732f2f7
20/04/13 14:55:57 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
20/04/13 14:55:57 INFO SparkEnv: Registering OutputCommitCoordinator
20/04/13 14:55:57 INFO Utils: Successfully started service 'SparkUI' on port 4040.
20/04/13 14:55:57 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://windows-machine.domain.local:4040
20/04/13 14:55:57 INFO SparkContext: Added JAR file:///C:/Users/sfuruyam/dist/examples/jars/scopt_2.12-3.7.1.jar at spark://windows-machine.domain.local:61644/jars/scopt_2.12-3.7.1.jar with timestamp 1586757357610
20/04/13 14:55:57 INFO SparkContext: Added JAR file:///C:/Users/sfuruyam/dist/examples/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar at spark://windows-machine.domain.local:61644/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586757357610
20/04/13 14:55:57 INFO Executor: Starting executor ID driver on host windows-machine.domain.local
20/04/13 14:55:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 61687.
20/04/13 14:55:57 INFO NettyBlockTransferService: Server created on windows-machine.domain.local:61687
20/04/13 14:55:57 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
20/04/13 14:55:57 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, windows-machine.domain.local, 61687, None)
20/04/13 14:55:57 INFO BlockManagerMasterEndpoint: Registering block manager windows-machine.domain.local:61687 with 366.3 MiB RAM, BlockManagerId(driver, windows-machine.domain.local, 61687, None)
20/04/13 14:55:57 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, windows-machine.domain.local, 61687, None)
20/04/13 14:55:57 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, windows-machine.domain.local, 61687, None)
20/04/13 14:55:58 INFO SparkContext: Starting job: reduce at SparkPi.scala:38
20/04/13 14:55:58 INFO DAGScheduler: Got job 0 (reduce at SparkPi.scala:38) with 2 output partitions
20/04/13 14:55:58 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at SparkPi.scala:38)
20/04/13 14:55:58 INFO DAGScheduler: Parents of final stage: List()
20/04/13 14:55:58 INFO DAGScheduler: Missing parents: List()
20/04/13 14:55:58 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34), which has no missing parents
20/04/13 14:55:58 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.1 KiB, free 366.3 MiB)
20/04/13 14:55:58 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1816.0 B, free 366.3 MiB)
20/04/13 14:55:58 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on windows-machine.domain.local:61687 (size: 1816.0 B, free: 366.3 MiB)
20/04/13 14:55:58 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1270
20/04/13 14:55:58 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at SparkPi.scala:34) (first 15 tasks are for partitions Vector(0, 1))
20/04/13 14:55:58 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks resource profile 0
20/04/13 14:55:59 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, windows-machine.domain.local, executor driver, partition 0, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:55:59 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, windows-machine.domain.local, executor driver, partition 1, PROCESS_LOCAL, 7393 bytes) taskResourceAssignments Map()
20/04/13 14:55:59 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
20/04/13 14:55:59 INFO Executor: Running task 1.0 in stage 0.0 (TID 1)
20/04/13 14:55:59 INFO Executor: Fetching spark://windows-machine.domain.local:61644/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar with timestamp 1586757357610
20/04/13 14:55:59 INFO TransportClientFactory: Successfully created connection to windows-machine.domain.local/10.10.138.148:61644 after 38 ms (0 ms spent in bootstraps)
20/04/13 14:55:59 INFO Utils: Fetching spark://windows-machine.domain.local:61644/jars/spark-examples_2.12-3.1.0-SNAPSHOT.jar to C:\Users\sfuruyam\AppData\Local\Temp\spark-6c3d1591-241d-42d4-a03c-19507beb0b3a\userFiles-faf75c09-477d-45a6-8e39-db1f1713db48\fetchFileTemp716720126225905657.tmp
20/04/13 14:55:59 INFO Executor: Adding file:/C:/Users/sfuruyam/AppData/Local/Temp/spark-6c3d1591-241d-42d4-a03c-19507beb0b3a/userFiles-faf75c09-477d-45a6-8e39-db1f1713db48/spark-examples_2.12-3.1.0-SNAPSHOT.jar to class loader
20/04/13 14:55:59 INFO Executor: Fetching spark://windows-machine.domain.local:61644/jars/scopt_2.12-3.7.1.jar with timestamp 1586757357610
20/04/13 14:55:59 INFO Utils: Fetching spark://windows-machine.domain.local:61644/jars/scopt_2.12-3.7.1.jar to C:\Users\sfuruyam\AppData\Local\Temp\spark-6c3d1591-241d-42d4-a03c-19507beb0b3a\userFiles-faf75c09-477d-45a6-8e39-db1f1713db48\fetchFileTemp6299280991930314176.tmp
20/04/13 14:55:59 INFO Executor: Adding file:/C:/Users/sfuruyam/AppData/Local/Temp/spark-6c3d1591-241d-42d4-a03c-19507beb0b3a/userFiles-faf75c09-477d-45a6-8e39-db1f1713db48/scopt_2.12-3.7.1.jar to class loader
20/04/13 14:55:59 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 1000 bytes result sent to driver
20/04/13 14:55:59 INFO Executor: Finished task 1.0 in stage 0.0 (TID 1). 1000 bytes result sent to driver
20/04/13 14:56:00 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 993 ms on windows-machine.domain.local (executor driver) (1/2)
20/04/13 14:56:00 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 965 ms on windows-machine.domain.local (executor driver) (2/2)
20/04/13 14:56:00 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
20/04/13 14:56:00 INFO DAGScheduler: ResultStage 0 (reduce at SparkPi.scala:38) finished in 1.315 s
20/04/13 14:56:00 INFO DAGScheduler: Job 0 is finished. Cancelling potential speculative or zombie tasks for this job
20/04/13 14:56:00 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage finished
20/04/13 14:56:00 INFO DAGScheduler: Job 0 finished: reduce at SparkPi.scala:38, took 1.439560 s
Pi is roughly 3.1428557142785714
20/04/13 14:56:00 INFO SparkUI: Stopped Spark web UI at http://windows-machine.domain.local:4040
20/04/13 14:56:00 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
20/04/13 14:56:00 INFO MemoryStore: MemoryStore cleared
20/04/13 14:56:00 INFO BlockManager: BlockManager stopped
20/04/13 14:56:00 INFO BlockManagerMaster: BlockManagerMaster stopped
20/04/13 14:56:00 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
20/04/13 14:56:00 INFO SparkContext: Successfully stopped SparkContext
20/04/13 14:56:00 INFO ShutdownHookManager: Shutdown hook called
20/04/13 14:56:00 INFO ShutdownHookManager: Deleting directory C:\Users\sfuruyam\AppData\Local\Temp\spark-6c3d1591-241d-42d4-a03c-19507beb0b3a
20/04/13 14:56:00 INFO ShutdownHookManager: Deleting directory C:\Users\sfuruyam\AppData\Local\Temp\spark-e5b5a74d-f512-49f7-85b8-05d8951a6cdd
C:\Users\sfuruyam\dist>
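Note: the ERROR at the top of the second Windows run is unrelated to SPARK_JARS_DIR. Hadoop's Shell class looks for winutils.exe under %HADOOP_HOME%\bin, and with HADOOP_HOME unset the resolved path degenerates to "null\bin\winutils.exe", exactly as logged. A rough sketch of that lookup (simplified; the real logic lives in org.apache.hadoop.util.Shell.getQualifiedBinPath):

```shell
# Simplified sketch of Hadoop's winutils.exe lookup on Windows.
# With HADOOP_HOME unset, the resolved path starts with "null",
# which is what produces the "null\bin\winutils.exe" error above.
winutils_path() {
  printf '%s/bin/winutils.exe\n' "${HADOOP_HOME:-null}"
}

unset HADOOP_HOME
winutils_path   # prints: null/bin/winutils.exe
```

The warning is harmless for local runs like SparkPi; pointing HADOOP_HOME at a directory containing bin\winutils.exe silences it.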
[sfuruyam@sfuruyam2 dist]$ ./sbin/start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /home/sfuruyam/dist/logs/spark-sfuruyam-org.apache.spark.deploy.master.Master-1-testing-machine.out
[sfuruyam@sfuruyam2 dist]$ jps | grep Master
2159 Master
[sfuruyam@sfuruyam2 dist]$ kill -kill 2159
[sfuruyam@sfuruyam2 dist]$ mv jars/ jars-moved
[sfuruyam@sfuruyam2 dist]$ ./sbin/start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /home/sfuruyam/dist/logs/spark-sfuruyam-org.apache.spark.deploy.master.Master-1-testing-machine.out
failed to launch: nice -n 0 /home/sfuruyam/dist/bin/spark-class org.apache.spark.deploy.master.Master --host testing-machine --port 7077 --webui-port 8080
  Failed to find Spark jars directory (/home/sfuruyam/dist/jars).
  You need to set an appropriate directory for SPARK_JARS_DIR or build Spark with the target "package" before running this program.
full log in /home/sfuruyam/dist/logs/spark-sfuruyam-org.apache.spark.deploy.master.Master-1-testing-machine.out
[sfuruyam@sfuruyam2 dist]$ jps | grep Master
[sfuruyam@sfuruyam2 dist]$ export SPARK_JARS_DIR=/home/sfuruyam/dist/jars-moved
[sfuruyam@sfuruyam2 dist]$ ./sbin/start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /home/sfuruyam/dist/logs/spark-sfuruyam-org.apache.spark.deploy.master.Master-1-testing-machine.out
[sfuruyam@sfuruyam2 dist]$ jps | grep Master
5702 Master
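Both sessions exercise the same check: the launcher resolves the jars directory to $SPARK_HOME/jars unless SPARK_JARS_DIR overrides it, and aborts when the resolved directory is missing. A minimal sketch of the behavior the transcripts demonstrate (simplified from the launcher script; the error text mirrors the Linux message above):

```shell
# Simplified sketch of the jars-directory resolution shown above:
# SPARK_JARS_DIR falls back to $SPARK_HOME/jars when unset, and the
# launcher fails fast if the resolved directory does not exist.
resolve_jars_dir() {
  local dir="${SPARK_JARS_DIR:-$SPARK_HOME/jars}"
  if [ ! -d "$dir" ]; then
    echo "Failed to find Spark jars directory ($dir)." >&2
    return 1
  fi
  echo "$dir"
}
```

With jars/ moved aside the fallback path fails, matching the "Failed to find Spark jars directory" messages in both transcripts; exporting SPARK_JARS_DIR to the relocated directory restores a working launch.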