@wnina
Last active March 29, 2016

[error] 16/03/29 10:56:18 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
[error] 16/03/29 10:56:18 INFO AppClient$ClientEndpoint: Connecting to master spark://192.168.1.102:7077...
[error] 16/03/29 10:56:38 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[appclient-registration-retry-thread,5,main]
[error] java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@34ff1cc0 rejected from java.util.concurrent.ThreadPoolExecutor@58b89c6e[Running, pool size = 1, active threads = 0, queued tasks = 0, completed tasks = 1]
[error] at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2048)
[error] at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:821)
[error] at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1372)
[error] at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
[error] at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:96)
[error] at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1.apply(AppClient.scala:95)
[error] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
[error] at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
[error] at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
[error] at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
[error] at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
[error] at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:108)
[error] at org.apache.spark.deploy.client.AppClient$ClientEndpoint.tryRegisterAllMasters(AppClient.scala:95)
[error] at org.apache.spark.deploy.client.AppClient$ClientEndpoint.org$apache$spark$deploy$client$AppClient$ClientEndpoint$$registerWithMaster(AppClient.scala:121)
[error] at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2$$anonfun$run$1.apply$mcV$sp(AppClient.scala:132)
[error] at org.apache.spark.util.Utils$.tryOrExit(Utils.scala:1119)
[error] at org.apache.spark.deploy.client.AppClient$ClientEndpoint$$anon$2.run(AppClient.scala:124)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
[error] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
[error] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
[error] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[error] at java.lang.Thread.run(Thread.java:745)
[error] 16/03/29 10:56:38 INFO DiskBlockManager: Shutdown hook called
[error] 16/03/29 10:56:38 INFO ShutdownHookManager: Shutdown hook called
[error] 16/03/29 10:56:38 INFO ShutdownHookManager: Deleting directory /tmp/spark-e5c6a814-54b3-4565-b104-1c8829f3449b
java.lang.RuntimeException: Nonzero exit code returned from runner: 50
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code returned from runner: 50
[error] Total time: 24 s, completed Mar 29, 2016 10:56:38 AM
@wnina (Author) commented Mar 29, 2016

The content of spark-env.sh is:

export SPARK_MASTER_IP=192.168.1.102
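
For context, here is a minimal sketch (not from the original gist) of a driver connecting to that master; the object name, app name, and test job are illustrative:

import org.apache.spark.{SparkConf, SparkContext}

object ConnectTest {
  def main(args: Array[String]): Unit = {
    // The master URL must match the address the standalone master binds to
    // (SPARK_MASTER_IP in spark-env.sh above) and its default port 7077.
    val conf = new SparkConf()
      .setAppName("connect-test")              // hypothetical app name
      .setMaster("spark://192.168.1.102:7077") // same host/port as in the log
    val sc = new SparkContext(conf)
    // Run a trivial job to verify that the app registers with the master.
    println(sc.parallelize(1 to 10).count())
    sc.stop()
  }
}

If registration still times out as in the log above, common causes include the master port being unreachable or a driver/cluster version mismatch, as noted in the next comment.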

@wnina (Author) commented Mar 29, 2016


Thanks as well, I resolved it. I just had to make the Spark client (driver) version and the Spark server (cluster) version used with GeoTrellis the same.
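
For anyone hitting the same stack trace, a sketch of how that version pinning might look in build.sbt (the version numbers and the GeoTrellis artifact shown here are illustrative, not taken from the gist; sparkVersion must match whatever the master and workers actually run):

// build.sbt
val sparkVersion = "1.5.2" // illustrative; must equal the cluster's Spark version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "com.azavea.geotrellis" %% "geotrellis-spark" % "0.10.0" // hypothetical version
)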
