Stack trace of the error that occurs when setMaster(<url>) is set in Jupyter Notebook code
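For context, a minimal sketch of the kind of notebook cell that produces the trace below. The master URL, app name, and the RDD action are illustrative assumptions, not taken from this gist:

# Hypothetical notebook cell -- the master URL and app name are assumptions.
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("notebook-app").setMaster("spark://master:7077")
sc = SparkContext(conf=conf)

# Shipping any Python task to the executors triggers the failure below
# when no "python" binary is on the worker machines' PATH.
print(sc.parallelize(range(10)).sum())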
Spark Executor Command: "/usr/lib/jvm/java-1.8.0-openjdk-amd64/bin/java" "-cp" "/home/orwa/spark/conf/:/home/orwa/spark/jars/*" "-Xmx1024M" "-Dspark.driver.port=37501" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@master:37501" "--executor-id" "0" "--hostname" "192.168.198.131" "--cores" "2" "--app-id" "app-20200630012803-0001" "--worker-url" "spark://Worker@192.168.198.131:37685" | |
======================================== | |
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties | |
20/06/30 01:28:39 INFO CoarseGrainedExecutorBackend: Started daemon with process name: 8519@orwa-virtual-machine | |
20/06/30 01:28:39 INFO SignalUtils: Registered signal handler for TERM | |
20/06/30 01:28:39 INFO SignalUtils: Registered signal handler for HUP | |
20/06/30 01:28:39 INFO SignalUtils: Registered signal handler for INT | |
20/06/30 01:28:39 WARN Utils: Your hostname, orwa-virtual-machine resolves to a loopback address: 127.0.1.1; using 192.168.198.131 instead (on interface ens33) | |
20/06/30 01:28:39 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address | |
20/06/30 01:28:40 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable | |
20/06/30 01:28:40 INFO SecurityManager: Changing view acls to: orwa | |
20/06/30 01:28:40 INFO SecurityManager: Changing modify acls to: orwa | |
20/06/30 01:28:40 INFO SecurityManager: Changing view acls groups to: | |
20/06/30 01:28:40 INFO SecurityManager: Changing modify acls groups to: | |
20/06/30 01:28:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(orwa); groups with view permissions: Set(); users with modify permissions: Set(orwa); groups with modify permissions: Set() | |
20/06/30 01:28:40 INFO TransportClientFactory: Successfully created connection to master/192.168.198.131:37501 after 57 ms (0 ms spent in bootstraps) | |
20/06/30 01:28:40 INFO SecurityManager: Changing view acls to: orwa | |
20/06/30 01:28:40 INFO SecurityManager: Changing modify acls to: orwa | |
20/06/30 01:28:40 INFO SecurityManager: Changing view acls groups to: | |
20/06/30 01:28:40 INFO SecurityManager: Changing modify acls groups to: | |
20/06/30 01:28:40 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(orwa); groups with view permissions: Set(); users with modify permissions: Set(orwa); groups with modify permissions: Set() | |
20/06/30 01:28:40 INFO TransportClientFactory: Successfully created connection to master/192.168.198.131:37501 after 4 ms (0 ms spent in bootstraps) | |
20/06/30 01:28:40 INFO DiskBlockManager: Created local directory at /tmp/spark-411f1d1d-df0b-4044-a46e-f2ae645ce956/executor-4e554c6d-6003-4763-a73b-4a41320d543a/blockmgr-ab6a03a3-503c-44e2-9b3b-7b726e1f6960 | |
20/06/30 01:28:40 INFO MemoryStore: MemoryStore started with capacity 366.3 MB | |
20/06/30 01:28:41 INFO CoarseGrainedExecutorBackend: Connecting to driver: spark://CoarseGrainedScheduler@master:37501 | |
20/06/30 01:28:41 INFO WorkerWatcher: Connecting to worker spark://Worker@192.168.198.131:37685 | |
20/06/30 01:28:41 INFO WorkerWatcher: Successfully connected to spark://Worker@192.168.198.131:37685 | |
20/06/30 01:28:41 INFO TransportClientFactory: Successfully created connection to /192.168.198.131:37685 after 22 ms (0 ms spent in bootstraps) | |
20/06/30 01:28:41 INFO CoarseGrainedExecutorBackend: Successfully registered with driver | |
20/06/30 01:28:41 INFO Executor: Starting executor ID 0 on host 192.168.198.131 | |
20/06/30 01:28:41 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45067. | |
20/06/30 01:28:41 INFO NettyBlockTransferService: Server created on 192.168.198.131:45067 | |
20/06/30 01:28:41 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy | |
20/06/30 01:28:41 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(0, 192.168.198.131, 45067, None) | |
20/06/30 01:28:41 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(0, 192.168.198.131, 45067, None) | |
20/06/30 01:28:41 INFO BlockManager: Initialized BlockManager: BlockManagerId(0, 192.168.198.131, 45067, None) | |
20/06/30 01:28:41 INFO CoarseGrainedExecutorBackend: Got assigned task 0 | |
20/06/30 01:28:41 INFO Executor: Running task 0.0 in stage 0.0 (TID 0) | |
20/06/30 01:28:41 INFO TorrentBroadcast: Started reading broadcast variable 0 | |
20/06/30 01:28:41 INFO TransportClientFactory: Successfully created connection to master/192.168.198.131:34629 after 2 ms (0 ms spent in bootstraps) | |
20/06/30 01:28:41 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 14.8 KB, free 366.3 MB) | |
20/06/30 01:28:41 INFO TorrentBroadcast: Reading broadcast variable 0 took 111 ms | |
20/06/30 01:28:41 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 20.7 KB, free 366.3 MB) | |
20/06/30 01:28:41 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0) | |
java.io.IOException: Cannot run program "python": error=2, No such file or directory | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048) | |
at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:197) | |
at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:122) | |
at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:95) | |
at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:117) | |
at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:109) | |
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) | |
at org.apache.spark.scheduler.Task.run(Task.scala:123) | |
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408) | |
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) | |
at java.lang.Thread.run(Thread.java:748) | |
Caused by: java.io.IOException: error=2, No such file or directory | |
at java.lang.UNIXProcess.forkAndExec(Native Method) | |
at java.lang.UNIXProcess.<init>(UNIXProcess.java:247) | |
at java.lang.ProcessImpl.start(ProcessImpl.java:134) | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029) | |
... 16 more | |
20/06/30 01:28:41 INFO CoarseGrainedExecutorBackend: Got assigned task 1 | |
20/06/30 01:28:41 INFO Executor: Running task 0.1 in stage 0.0 (TID 1) | |
20/06/30 01:28:41 ERROR Executor: Exception in task 0.1 in stage 0.0 (TID 1) | |
java.io.IOException: Cannot run program "python": error=2, No such file or directory | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048) | |
at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:197) | |
at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:122) | |
at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:95) | |
at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:117) | |
at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:109) | |
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) | |
at org.apache.spark.scheduler.Task.run(Task.scala:123) | |
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408) | |
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) | |
at java.lang.Thread.run(Thread.java:748) | |
Caused by: java.io.IOException: error=2, No such file or directory | |
at java.lang.UNIXProcess.forkAndExec(Native Method) | |
at java.lang.UNIXProcess.<init>(UNIXProcess.java:247) | |
at java.lang.ProcessImpl.start(ProcessImpl.java:134) | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029) | |
... 16 more | |
20/06/30 01:28:41 INFO CoarseGrainedExecutorBackend: Got assigned task 2 | |
20/06/30 01:28:41 INFO Executor: Running task 0.2 in stage 0.0 (TID 2) | |
20/06/30 01:28:41 ERROR Executor: Exception in task 0.2 in stage 0.0 (TID 2) | |
java.io.IOException: Cannot run program "python": error=2, No such file or directory | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048) | |
at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:197) | |
at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:122) | |
at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:95) | |
at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:117) | |
at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:109) | |
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) | |
at org.apache.spark.scheduler.Task.run(Task.scala:123) | |
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408) | |
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) | |
at java.lang.Thread.run(Thread.java:748) | |
Caused by: java.io.IOException: error=2, No such file or directory | |
at java.lang.UNIXProcess.forkAndExec(Native Method) | |
at java.lang.UNIXProcess.<init>(UNIXProcess.java:247) | |
at java.lang.ProcessImpl.start(ProcessImpl.java:134) | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029) | |
... 16 more | |
20/06/30 01:28:41 INFO CoarseGrainedExecutorBackend: Got assigned task 3 | |
20/06/30 01:28:41 INFO Executor: Running task 0.3 in stage 0.0 (TID 3) | |
20/06/30 01:28:41 ERROR Executor: Exception in task 0.3 in stage 0.0 (TID 3) | |
java.io.IOException: Cannot run program "python": error=2, No such file or directory | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048) | |
at org.apache.spark.api.python.PythonWorkerFactory.startDaemon(PythonWorkerFactory.scala:197) | |
at org.apache.spark.api.python.PythonWorkerFactory.createThroughDaemon(PythonWorkerFactory.scala:122) | |
at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:95) | |
at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:117) | |
at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:109) | |
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) | |
at org.apache.spark.scheduler.Task.run(Task.scala:123) | |
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408) | |
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) | |
at java.lang.Thread.run(Thread.java:748) | |
Caused by: java.io.IOException: error=2, No such file or directory | |
at java.lang.UNIXProcess.forkAndExec(Native Method) | |
at java.lang.UNIXProcess.<init>(UNIXProcess.java:247) | |
at java.lang.ProcessImpl.start(ProcessImpl.java:134) | |
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029) | |
... 16 more | |
20/06/30 01:28:42 INFO CoarseGrainedExecutorBackend: Driver commanded a shutdown | |
20/06/30 01:28:42 ERROR CoarseGrainedExecutorBackend: RECEIVED SIGNAL TERM | |
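Note that the root cause in this trace is not setMaster() itself: the executor registers with the driver fine, and every task attempt then dies with `Cannot run program "python"`, meaning the worker machines have no `python` binary on their PATH. One common workaround is to point PySpark at an interpreter that does exist on every worker before creating the SparkContext. A minimal sketch, assuming the workers have an interpreter at /usr/bin/python3 (the path, master URL, and app name are assumptions):

import os

# Assumed interpreter path -- it must exist on the driver and on every worker.
# PYSPARK_PYTHON is read by PySpark and propagated to executor-side workers.
os.environ["PYSPARK_PYTHON"] = "/usr/bin/python3"
os.environ["PYSPARK_DRIVER_PYTHON"] = "/usr/bin/python3"

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("notebook-app").setMaster("spark://master:7077")
sc = SparkContext(conf=conf)
print(sc.parallelize(range(10)).sum())  # should now run on the executors

The environment variables must be set before the SparkContext is constructed, since PySpark captures the interpreter path at context creation time.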