Gist geoHeil/946dd7706f44f338101c8332f4e13c1a, created May 11, 2016
sbt run spark exception
16/05/11 13:01:28 INFO DAGScheduler: ResultStage 29 (map at DrilldownArtist.scala:337) finished in 2,172 s
16/05/11 13:01:28 INFO DAGScheduler: Job 13 finished: map at DrilldownArtist.scala:337, took 38,416625 s
[error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task 697 in stage 1.0 failed 1 times, most recent failure: Lost task 697.0 in stage 1.0 (TID 705, localhost): ExecutorLostFailure (executor driver exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 156451 ms
[error] Driver stacktrace:
16/05/11 13:01:28 INFO MapOutputTrackerMaster: Size of output statuses for shuffle 12 is 14186 bytes
org.apache.spark.SparkException: Job aborted due to stage failure: Task 697 in stage 1.0 failed 1 times, most recent failure: Lost task 697.0 in stage 1.0 (TID 705, localhost): ExecutorLostFailure (executor driver exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 156451 ms
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
[trace] Stack trace suppressed: run last compile:run for the full output.
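The root cause reported above is the executor heartbeat timing out (156451 ms) while a long shuffle-heavy stage runs in local mode. A common mitigation, sketched here as an unverified assumption about this job's setup (the app name and master are guesses), is to raise the relevant timeouts when building the SparkContext:

```scala
// Hedged sketch: raise the timeouts behind the ExecutorLostFailure above.
// spark.network.timeout and spark.executor.heartbeatInterval are standard
// Spark settings; the concrete values here are illustrative, not tuned.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setAppName("DrilldownArtist")   // assumed name, per the log's source file
  .setMaster("local[*]")
  // The log shows a heartbeat timeout after ~156 s, so lift the ceiling
  // well past that and heartbeat less aggressively.
  .set("spark.network.timeout", "600s")
  .set("spark.executor.heartbeatInterval", "60s")

val sc = new SparkContext(conf)
```

Since the job runs via `sbt run`, forking it into its own JVM (`fork in run := true` in build.sbt) and raising that JVM's heap are common companion tweaks, as sbt's in-process interrupts can otherwise cascade into the `InterruptedException`s seen further down.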
16/05/11 13:01:28 INFO DAGScheduler: Registering RDD 111 (map at DrilldownArtist.scala:337)
16/05/11 13:01:28 INFO DAGScheduler: Got job 14 (collectAsync at DrilldownArtist.scala:337) with 179 output partitions
16/05/11 13:01:28 INFO DAGScheduler: Final stage: ResultStage 32 (collectAsync at DrilldownArtist.scala:337)
16/05/11 13:01:28 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 31)
16/05/11 13:01:28 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 31)
16/05/11 13:01:28 INFO DAGScheduler: Submitting ShuffleMapStage 31 (MapPartitionsRDD[111] at map at DrilldownArtist.scala:337), which has no missing parents
16/05/11 13:01:28 INFO MemoryStore: Block broadcast_26 stored as values in memory (estimated size 23.1 KB, free 68.4 MB)
16/05/11 13:01:28 INFO MemoryStore: Block broadcast_26_piece0 stored as bytes in memory (estimated size 9.9 KB, free 68.4 MB)
16/05/11 13:01:28 INFO BlockManagerInfo: Added broadcast_26_piece0 in memory on localhost:56129 (size: 9.9 KB, free: 412.5 MB)
16/05/11 13:01:28 INFO SparkContext: Created broadcast 26 from broadcast at DAGScheduler.scala:1006
16/05/11 13:01:28 INFO DAGScheduler: Submitting 200 missing tasks from ShuffleMapStage 31 (MapPartitionsRDD[111] at map at DrilldownArtist.scala:337)
16/05/11 13:01:28 INFO TaskSchedulerImpl: Adding task set 31.0 with 200 tasks
16/05/11 13:01:28 INFO TaskSetManager: Starting task 0.0 in stage 31.0 (TID 10961, localhost, partition 0,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO TaskSetManager: Starting task 1.0 in stage 31.0 (TID 10962, localhost, partition 1,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO TaskSetManager: Starting task 2.0 in stage 31.0 (TID 10963, localhost, partition 2,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO TaskSetManager: Starting task 3.0 in stage 31.0 (TID 10964, localhost, partition 3,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO TaskSetManager: Starting task 4.0 in stage 31.0 (TID 10965, localhost, partition 4,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO TaskSetManager: Starting task 5.0 in stage 31.0 (TID 10966, localhost, partition 5,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO TaskSetManager: Starting task 6.0 in stage 31.0 (TID 10967, localhost, partition 6,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO TaskSetManager: Starting task 7.0 in stage 31.0 (TID 10968, localhost, partition 7,PROCESS_LOCAL, 1988 bytes)
16/05/11 13:01:28 INFO Executor: Running task 3.0 in stage 31.0 (TID 10964)
16/05/11 13:01:28 INFO Executor: Running task 0.0 in stage 31.0 (TID 10961)
16/05/11 13:01:28 INFO Executor: Running task 5.0 in stage 31.0 (TID 10966)
16/05/11 13:01:28 INFO Executor: Running task 6.0 in stage 31.0 (TID 10967)
16/05/11 13:01:28 INFO Executor: Running task 2.0 in stage 31.0 (TID 10963)
16/05/11 13:01:28 INFO Executor: Running task 7.0 in stage 31.0 (TID 10968)
16/05/11 13:01:28 INFO Executor: Running task 1.0 in stage 31.0 (TID 10962)
16/05/11 13:01:28 INFO Executor: Running task 4.0 in stage 31.0 (TID 10965)
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
16/05/11 13:01:28 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
	at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:176)
	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1180)
	at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:173)
	at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:68)
16/05/11 13:01:28 ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:998)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at java.util.concurrent.Semaphore.acquire(Semaphore.java:312)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(AsynchronousListenerBus.scala:66)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:65)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:65)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:64)
	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1180)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
16/05/11 13:01:28 ERROR ShuffleBlockFetcherIterator: Error occurred while fetching local blocks
java.lang.InterruptedException
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireInterruptibly(AbstractQueuedSynchronizer.java:1220)
	at java.util.concurrent.locks.ReentrantLock.lockInterruptibly(ReentrantLock.java:335)
	at java.util.concurrent.LinkedBlockingQueue.put(LinkedBlockingQueue.java:339)
	at org.apache.spark.storage.ShuffleBlockFetcherIterator.fetchLocalBlocks(ShuffleBlockFetcherIterator.scala:242)
	at org.apache.spark.storage.ShuffleBlockFetcherIterator.initialize(ShuffleBlockFetcherIterator.scala:269)
	at org.apache.spark.storage.ShuffleBlockFetcherIterator.<init>(ShuffleBlockFetcherIterator.scala:112)
	at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:43)
	at org.apache.spark.sql.execution.ShuffledRowRDD.compute(ShuffledRowRDD.scala:166)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:89)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
[the "ERROR ShuffleBlockFetcherIterator: Error occurred while fetching local blocks" block, with the identical java.lang.InterruptedException stack trace, repeats verbatim six more times at 13:01:28, once per remaining running task]
16/05/11 13:01:29 WARN TaskMemoryManager: leak 2.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@26c15a35
16/05/11 13:01:29 WARN TaskMemoryManager: leak 2.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@3a3db5d5
16/05/11 13:01:28 INFO ShuffleBlockFetcherIterator: Getting 759 non-empty blocks out of 1112 blocks
16/05/11 13:01:30 INFO ShuffleBlockFetcherIterator: Started 0 remote fetches in 1677 ms
16/05/11 13:01:30 ERROR ShuffleBlockFetcherIterator: Error occurred while fetching local blocks
[identical java.lang.InterruptedException stack trace omitted]
16/05/11 13:01:30 INFO SparkUI: Stopped Spark web UI at http://128.131.193.128:4040
16/05/11 13:01:30 ERROR Executor: Managed memory leak detected; size = 2359296 bytes, TID = 10967
16/05/11 13:01:30 WARN TaskMemoryManager: leak 2.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@117ed4df
16/05/11 13:01:30 WARN TaskMemoryManager: leak 2.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@751848e
16/05/11 13:01:30 ERROR Executor: Managed memory leak detected; size = 2359296 bytes, TID = 10966
16/05/11 13:01:30 ERROR Executor: Managed memory leak detected; size = 2359296 bytes, TID = 10965
16/05/11 13:01:30 ERROR Executor: Managed memory leak detected; size = 2359296 bytes, TID = 10963
16/05/11 13:01:30 WARN TaskMemoryManager: leak 2.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@290d290e
16/05/11 13:01:30 ERROR Executor: Managed memory leak detected; size = 2359296 bytes, TID = 10964
java.lang.RuntimeException: Nonzero exit code: 1
	at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
16/05/11 13:01:30 WARN TaskMemoryManager: leak 2.3 MB memory from org.apache.spark.unsafe.map.BytesToBytesMap@3c54f229
16/05/11 13:01:30 ERROR Executor: Managed memory leak detected; size = 2359296 bytes, TID = 10968
[error] (compile:run) Nonzero exit code: 1
[error] Total time: 501 s, completed 11.05.2016 13:01:30
16/05/11 13:01:30 INFO DiskBlockManager: Shutdown hook called