
@giaosudau
Created June 10, 2016 07:43
Spark Job Server error with dynamic allocation
[2016-06-10 14:42:22,137] INFO ocalContextSupervisorActor [] [akka://JobServer/user/context-supervisor] - Creating a SparkContext named sql_context-1
[2016-06-10 14:42:22,138] INFO .jobserver.JobManagerActor [] [akka://JobServer/user/context-supervisor/sql_context-1] - Starting actor spark.jobserver.JobManagerActor
[2016-06-10 14:42:22,140] INFO k.jobserver.JobStatusActor [] [akka://JobServer/user/context-supervisor/sql_context-1/$a] - Starting actor spark.jobserver.JobStatusActor
[2016-06-10 14:42:22,140] INFO k.jobserver.JobResultActor [] [akka://JobServer/user/context-supervisor/sql_context-1/$b] - Starting actor spark.jobserver.JobResultActor
[2016-06-10 14:42:22,144] WARN .apache.spark.SparkContext [] [akka://JobServer/user/context-supervisor/sql_context-1] - Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
spark.jobserver.context.SQLContextFactory$$anon$1.<init>(SQLContextFactory.scala:13)
spark.jobserver.context.SQLContextFactory.makeContext(SQLContextFactory.scala:13)
spark.jobserver.context.SQLContextFactory.makeContext(SQLContextFactory.scala:9)
spark.jobserver.context.SparkContextFactory$class.makeContext(SparkContextFactory.scala:37)
spark.jobserver.context.SQLContextFactory.makeContext(SQLContextFactory.scala:9)
spark.jobserver.JobManagerActor.createContextFromConfig(JobManagerActor.scala:378)
spark.jobserver.JobManagerActor$$anonfun$wrappedReceive$1.applyOrElse(JobManagerActor.scala:122)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
ooyala.common.akka.ActorStack$$anonfun$receive$1.applyOrElse(ActorStack.scala:33)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
ooyala.common.akka.Slf4jLogging$$anonfun$receive$1$$anonfun$applyOrElse$1.apply$mcV$sp(Slf4jLogging.scala:26)
ooyala.common.akka.Slf4jLogging$class.ooyala$common$akka$Slf4jLogging$$withAkkaSourceLogging(Slf4jLogging.scala:35)
ooyala.common.akka.Slf4jLogging$$anonfun$receive$1.applyOrElse(Slf4jLogging.scala:25)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
[2016-06-10 14:42:22,145] INFO .apache.spark.SparkContext [] [akka://JobServer/user/context-supervisor/sql_context-1] - Running Spark version 1.6.1
[2016-06-10 14:42:22,147] INFO ache.spark.SecurityManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - Changing view acls to: root
[2016-06-10 14:42:22,147] INFO ache.spark.SecurityManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - Changing modify acls to: root
[2016-06-10 14:42:22,147] INFO ache.spark.SecurityManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
[2016-06-10 14:42:22,163] INFO rg.apache.spark.util.Utils [] [akka://JobServer/user/context-supervisor/sql_context-1] - Successfully started service 'sparkDriver' on port 49701.
[2016-06-10 14:42:22,190] INFO ka.event.slf4j.Slf4jLogger [] [akka://JobServer/user/context-supervisor/sql_context-1] - Slf4jLogger started
[2016-06-10 14:42:22,195] INFO Remoting [] [Remoting] - Starting remoting
[2016-06-10 14:42:22,206] INFO Remoting [] [Remoting] - Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.197.0.3:36236]
[2016-06-10 14:42:22,208] INFO rg.apache.spark.util.Utils [] [akka://JobServer/user/context-supervisor/sql_context-1] - Successfully started service 'sparkDriverActorSystem' on port 36236.
[2016-06-10 14:42:22,209] INFO org.apache.spark.SparkEnv [] [akka://JobServer/user/context-supervisor/sql_context-1] - Registering MapOutputTracker
[2016-06-10 14:42:22,212] INFO org.apache.spark.SparkEnv [] [akka://JobServer/user/context-supervisor/sql_context-1] - Registering BlockManagerMaster
[2016-06-10 14:42:22,213] INFO k.storage.DiskBlockManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - Created local directory at /data/tmp/blockmgr-0f8d5ddf-db7a-490c-ab24-7bcbbc69c353
[2016-06-10 14:42:22,214] INFO .spark.storage.MemoryStore [] [akka://JobServer/user/context-supervisor/sql_context-1] - MemoryStore started with capacity 3.5 GB
[2016-06-10 14:42:22,218] INFO org.apache.spark.SparkEnv [] [akka://JobServer/user/context-supervisor/sql_context-1] - Registering OutputCommitCoordinator
[2016-06-10 14:42:22,269] INFO roject.jetty.server.Server [] [akka://JobServer/user/context-supervisor/sql_context-1] - jetty-8.y.z-SNAPSHOT
[2016-06-10 14:42:22,275] INFO y.server.AbstractConnector [] [akka://JobServer/user/context-supervisor/sql_context-1] - Started SelectChannelConnector@0.0.0.0:9800
[2016-06-10 14:42:22,276] INFO rg.apache.spark.util.Utils [] [akka://JobServer/user/context-supervisor/sql_context-1] - Successfully started service 'SparkUI' on port 9800.
[2016-06-10 14:42:22,276] INFO rg.apache.spark.ui.SparkUI [] [akka://JobServer/user/context-supervisor/sql_context-1] - Started SparkUI at http://10.197.0.3:9800
[2016-06-10 14:42:22,296] INFO pache.spark.HttpFileServer [] [akka://JobServer/user/context-supervisor/sql_context-1] - HTTP File server directory is /data/tmp/spark-9d1c19ab-e588-43d1-b2ee-bef6df8322c0/httpd-6780c2d7-2d6c-4a90-8888-870d2da3e480
[2016-06-10 14:42:22,297] INFO rg.apache.spark.HttpServer [] [akka://JobServer/user/context-supervisor/sql_context-1] - Starting HTTP Server
[2016-06-10 14:42:22,298] INFO roject.jetty.server.Server [] [akka://JobServer/user/context-supervisor/sql_context-1] - jetty-8.y.z-SNAPSHOT
[2016-06-10 14:42:22,300] INFO y.server.AbstractConnector [] [akka://JobServer/user/context-supervisor/sql_context-1] - Started SocketConnector@0.0.0.0:52112
[2016-06-10 14:42:22,301] INFO rg.apache.spark.util.Utils [] [akka://JobServer/user/context-supervisor/sql_context-1] - Successfully started service 'HTTP file server' on port 52112.
[2016-06-10 14:42:22,343] INFO .apache.spark.SparkContext [] [akka://JobServer/user/context-supervisor/sql_context-1] - Added JAR file:/home/spark/job-server/spark-job-server.jar at http://10.197.0.3:52112/jars/spark-job-server.jar with timestamp 1465544542343
[2016-06-10 14:42:22,345] WARN .apache.spark.SparkContext [] [akka://JobServer/user/context-supervisor/sql_context-1] - Master URL for a multi-master Mesos cluster managed by ZooKeeper should be in the form mesos://zk://host:port. Current Master URL will stop working in Spark 2.0.
[2016-06-10 14:42:22,349] INFO ler.FairSchedulableBuilder [] [akka://JobServer/user/context-supervisor/sql_context-1] - Created default pool default, schedulingMode: FIFO, minShare: 0, weight: 1
[2016-06-10 14:42:22,357] INFO oarseMesosSchedulerBackend [] [] - Registered as framework ID 5c98471e-d4d4-43a1-8ea0-57f1ce3c3f4d-0010
[2016-06-10 14:42:22,364] INFO rg.apache.spark.util.Utils [] [akka://JobServer/user/context-supervisor/sql_context-1] - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34929.
[2016-06-10 14:42:22,365] INFO .NettyBlockTransferService [] [akka://JobServer/user/context-supervisor/sql_context-1] - Server created on 34929
[2016-06-10 14:42:22,369] INFO spark.storage.BlockManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - external shuffle service port = 7337
[2016-06-10 14:42:22,370] INFO storage.BlockManagerMaster [] [akka://JobServer/user/context-supervisor/sql_context-1] - Trying to register BlockManager
[2016-06-10 14:42:22,371] INFO BlockManagerMasterEndpoint [] [akka://JobServer/user/context-supervisor/sql_context-1] - Registering block manager 10.197.0.3:34929 with 3.5 GB RAM, BlockManagerId(driver, 10.197.0.3, 34929)
[2016-06-10 14:42:22,372] INFO storage.BlockManagerMaster [] [akka://JobServer/user/context-supervisor/sql_context-1] - Registered BlockManager
[2016-06-10 14:42:22,406] INFO oarseMesosSchedulerBackend [] [akka://JobServer/user/context-supervisor/sql_context-1] - SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
[2016-06-10 14:42:22,682] INFO ocalContextSupervisorActor [] [] - SparkContext sql_context-1 initialized
[2016-06-10 14:42:27,334] INFO oarseMesosSchedulerBackend [] [] - Mesos task 2 is now TASK_RUNNING
[2016-06-10 14:42:27,382] INFO oarseMesosSchedulerBackend [] [akka://JobServer/user/context-supervisor/sql_context-1] - driver.run() returned with code DRIVER_ABORTED
[2016-06-10 14:42:27,387] INFO .apache.spark.SparkContext [] [akka://JobServer/user/context-supervisor/sql_context-1] - Invoking stop() from shutdown hook
[2016-06-10 14:42:27,410] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/static/sql,null}
[2016-06-10 14:42:27,412] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/SQL/execution/json,null}
[2016-06-10 14:42:27,412] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/SQL/execution,null}
[2016-06-10 14:42:27,412] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/SQL/json,null}
[2016-06-10 14:42:27,413] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/SQL,null}
[2016-06-10 14:42:27,413] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/metrics/json,null}
[2016-06-10 14:42:27,413] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/stages/stage/kill,null}
[2016-06-10 14:42:27,414] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/api,null}
[2016-06-10 14:42:27,415] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/,null}
[2016-06-10 14:42:27,416] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/static,null}
[2016-06-10 14:42:27,416] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/executors/threadDump/json,null}
[2016-06-10 14:42:27,416] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/executors/threadDump,null}
[2016-06-10 14:42:27,417] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/executors/json,null}
[2016-06-10 14:42:27,417] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/executors,null}
[2016-06-10 14:42:27,417] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/environment/json,null}
[2016-06-10 14:42:27,418] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/environment,null}
[2016-06-10 14:42:27,418] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/storage/rdd/json,null}
[2016-06-10 14:42:27,419] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/storage/rdd,null}
[2016-06-10 14:42:27,419] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/storage/json,null}
[2016-06-10 14:42:27,420] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/storage,null}
[2016-06-10 14:42:27,420] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/stages/pool/json,null}
[2016-06-10 14:42:27,420] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/stages/pool,null}
[2016-06-10 14:42:27,420] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/stages/stage/json,null}
[2016-06-10 14:42:27,420] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/stages/stage,null}
[2016-06-10 14:42:27,421] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/stages/json,null}
[2016-06-10 14:42:27,421] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/stages,null}
[2016-06-10 14:42:27,421] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/jobs/job/json,null}
[2016-06-10 14:42:27,421] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/jobs/job,null}
[2016-06-10 14:42:27,421] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/jobs/json,null}
[2016-06-10 14:42:27,421] INFO ver.handler.ContextHandler [] [akka://JobServer/user/context-supervisor/sql_context-1] - stopped o.s.j.s.ServletContextHandler{/jobs,null}
[2016-06-10 14:42:27,473] INFO rg.apache.spark.ui.SparkUI [] [akka://JobServer/user/context-supervisor/sql_context-1] - Stopped Spark web UI at http://10.197.0.3:9800
[2016-06-10 14:42:27,478] INFO oarseMesosSchedulerBackend [] [akka://JobServer/user/context-supervisor/sql_context-1] - Shutting down all executors
[2016-06-10 14:42:27,479] INFO oarseMesosSchedulerBackend [] [akka://JobServer/user/context-supervisor/sql_context-1] - Asking each executor to shut down
[2016-06-10 14:42:27,482] INFO utputTrackerMasterEndpoint [] [akka://JobServer/user/context-supervisor/sql_context-1] - MapOutputTrackerMasterEndpoint stopped!
[2016-06-10 14:42:27,509] INFO .spark.storage.MemoryStore [] [akka://JobServer/user/context-supervisor/sql_context-1] - MemoryStore cleared
[2016-06-10 14:42:27,509] INFO spark.storage.BlockManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - BlockManager stopped
[2016-06-10 14:42:27,509] INFO storage.BlockManagerMaster [] [akka://JobServer/user/context-supervisor/sql_context-1] - BlockManagerMaster stopped
[2016-06-10 14:42:27,510] INFO tCommitCoordinatorEndpoint [] [akka://JobServer/user/context-supervisor/sql_context-1] - OutputCommitCoordinator stopped!
[2016-06-10 14:42:27,512] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriverActorSystem@10.197.0.3:36236/system/remoting-terminator] - Shutting down remote daemon.
[2016-06-10 14:42:27,512] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriverActorSystem@10.197.0.3:36236/system/remoting-terminator] - Remote daemon shut down; proceeding with flushing remote transports.
[2016-06-10 14:42:27,517] INFO rovider$RemotingTerminator [] [akka.tcp://sparkDriverActorSystem@10.197.0.3:36236/system/remoting-terminator] - Remoting shut down.
[2016-06-10 14:42:27,518] INFO .apache.spark.SparkContext [] [akka://JobServer/user/context-supervisor/sql_context-1] - Successfully stopped SparkContext
[2016-06-10 14:42:27,519] INFO k.util.ShutdownHookManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - Shutdown hook called
[2016-06-10 14:42:27,520] INFO k.util.ShutdownHookManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - Deleting directory /data/tmp/spark-9d1c19ab-e588-43d1-b2ee-bef6df8322c0/httpd-e421f7f5-bb33-40a4-960e-a874c6599d9f
[2016-06-10 14:42:27,526] INFO k.util.ShutdownHookManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - Deleting directory /data/tmp/spark-9d1c19ab-e588-43d1-b2ee-bef6df8322c0
[2016-06-10 14:42:27,531] INFO k.util.ShutdownHookManager [] [akka://JobServer/user/context-supervisor/sql_context-1] - Deleting directory /data/tmp/spark-9d1c19ab-e588-43d1-b2ee-bef6df8322c0/httpd-6780c2d7-2d6c-4a90-8888-870d2da3e480
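
The run above ends with the coarse-grained Mesos scheduler backend reporting driver.run() returned with code DRIVER_ABORTED, after which the shutdown hook stops the SparkContext. Two lines are worth calling out: the WARN that a multi-master Mesos cluster managed by ZooKeeper should use a mesos://zk://host:port master URL, and the "external shuffle service port = 7337" line, which suggests the external shuffle service was enabled, as dynamic allocation requires. What follows is a minimal spark-shell-style sketch of the Spark 1.6 properties typically involved in such a setup; the ZooKeeper hosts and app name are placeholders, and nothing here is taken from the job server's actual configuration beyond the context name and the shuffle-service port visible in the log.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Minimal sketch (not the job server's own code): Spark 1.6.x settings commonly
// used for a SQL context with dynamic allocation on Mesos coarse-grained mode.
// The ZooKeeper hosts are placeholders; the shuffle-service port matches the log.
val conf = new SparkConf()
  .setAppName("sql_context-1")
  // Multi-master Mesos behind ZooKeeper must use the zk:// form (see the WARN above).
  .setMaster("mesos://zk://zk1:2181,zk2:2181,zk3:2181/mesos")
  .set("spark.mesos.coarse", "true")               // CoarseMesosSchedulerBackend, as in the log
  .set("spark.dynamicAllocation.enabled", "true")  // requires the external shuffle service
  .set("spark.shuffle.service.enabled", "true")
  .set("spark.shuffle.service.port", "7337")       // "external shuffle service port = 7337"

val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)  // roughly what SQLContextFactory.makeContext builds

For dynamic allocation on Mesos, the external shuffle service also has to be running on every agent so executors can register their shuffle files with it; the settings above only cover the driver side.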