Created November 8, 2017 05:21
17/11/08 12:17:19 INFO spark.SparkContext: Running Spark version 2.2.0
17/11/08 12:17:19 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/11/08 12:17:19 WARN util.Utils: Your hostname, CPU11453 resolves to a loopback address: 127.0.1.1; using 10.199.220.51 instead (on interface eno1)
17/11/08 12:17:19 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
17/11/08 12:17:19 INFO spark.SparkContext: Submitted application: Simple Application
17/11/08 12:17:19 INFO spark.SecurityManager: Changing view acls to: cpu11453local
17/11/08 12:17:19 INFO spark.SecurityManager: Changing modify acls to: cpu11453local
17/11/08 12:17:19 INFO spark.SecurityManager: Changing view acls groups to:
17/11/08 12:17:19 INFO spark.SecurityManager: Changing modify acls groups to:
17/11/08 12:17:19 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(cpu11453local); groups with view permissions: Set(); users with modify permissions: Set(cpu11453local); groups with modify permissions: Set()
17/11/08 12:17:19 INFO util.Utils: Successfully started service 'sparkDriver' on port 40749.
17/11/08 12:17:19 INFO spark.SparkEnv: Registering MapOutputTracker
17/11/08 12:17:19 INFO spark.SparkEnv: Registering BlockManagerMaster
17/11/08 12:17:19 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/11/08 12:17:19 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/11/08 12:17:19 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-4e3e7d58-365c-43c4-9aaf-0c2750859e0b
17/11/08 12:17:19 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
17/11/08 12:17:19 INFO spark.SparkEnv: Registering OutputCommitCoordinator
17/11/08 12:17:19 INFO util.log: Logging initialized @1269ms
17/11/08 12:17:20 INFO server.Server: jetty-9.3.z-SNAPSHOT
17/11/08 12:17:20 INFO server.Server: Started @1324ms
17/11/08 12:17:20 INFO server.AbstractConnector: Started ServerConnector@672f11c2{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
17/11/08 12:17:20 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@69adf72c{/jobs,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1c25b8a7{/jobs/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@750fe12e{/jobs/job,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2ef8a8c3{/jobs/job/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63fd4873{/stages,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7544a1e4{/stages/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7957dc72{/stages/stage,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@82c57b3{/stages/stage/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@600b0b7{/stages/pool,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5ea502e0{/stages/pool/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@473b3b7a{/storage,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@77b7ffa4{/storage/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@402f80f5{/storage/rdd,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@133e019b{/storage/rdd/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7dac3fd8{/environment,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2102a4d5{/environment/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d4d3fe7{/executors,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51684e4a{/executors/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@38875e7d{/executors/threadDump,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@d816dde{/executors/threadDump/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6c451c9c{/static,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6cc0bcf6{/,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@32f61a31{/api,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49a64d82{/jobs/job/kill,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66d23e4a{/stages/stage/kill,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.199.220.51:4040
17/11/08 12:17:20 INFO spark.SparkContext: Added JAR file:/home/cpu11453local/workspace/testspark_scala/target/scala-2.12/simple-project_2.12-1.0.jar at spark://10.199.220.51:40749/jars/simple-project_2.12-1.0.jar with timestamp 1510118240125
17/11/08 12:17:20 INFO executor.Executor: Starting executor ID driver on host localhost
17/11/08 12:17:20 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 45313.
17/11/08 12:17:20 INFO netty.NettyBlockTransferService: Server created on 10.199.220.51:45313
17/11/08 12:17:20 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/11/08 12:17:20 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.199.220.51, 45313, None)
17/11/08 12:17:20 INFO storage.BlockManagerMasterEndpoint: Registering block manager 10.199.220.51:45313 with 366.3 MB RAM, BlockManagerId(driver, 10.199.220.51, 45313, None)
17/11/08 12:17:20 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.199.220.51, 45313, None)
17/11/08 12:17:20 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.199.220.51, 45313, None)
17/11/08 12:17:20 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@726a6b94{/metrics/json,null,AVAILABLE,@Spark}
17/11/08 12:17:20 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 237.1 KB, free 366.1 MB)
17/11/08 12:17:20 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 23.0 KB, free 366.0 MB)
17/11/08 12:17:20 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.199.220.51:45313 (size: 23.0 KB, free: 366.3 MB)
17/11/08 12:17:20 INFO spark.SparkContext: Created broadcast 0 from textFile at SimpleApp.scala:11
17/11/08 12:17:20 WARN util.ClosureCleaner: Expected a closure; got SimpleApp$$$Lambda$14/1715189999
17/11/08 12:17:21 INFO mapred.FileInputFormat: Total input paths to process : 1
17/11/08 12:17:21 INFO spark.SparkContext: Starting job: count at SimpleApp.scala:12
17/11/08 12:17:21 INFO scheduler.DAGScheduler: Got job 0 (count at SimpleApp.scala:12) with 2 output partitions
17/11/08 12:17:21 INFO scheduler.DAGScheduler: Final stage: ResultStage 0 (count at SimpleApp.scala:12)
17/11/08 12:17:21 INFO scheduler.DAGScheduler: Parents of final stage: List()
17/11/08 12:17:21 INFO scheduler.DAGScheduler: Missing parents: List()
17/11/08 12:17:21 INFO scheduler.DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[2] at filter at SimpleApp.scala:12), which has no missing parents
17/11/08 12:17:21 INFO memory.MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.8 KB, free 366.0 MB)
17/11/08 12:17:21 INFO memory.MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.2 KB, free 366.0 MB)
17/11/08 12:17:21 INFO storage.BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.199.220.51:45313 (size: 2.2 KB, free: 366.3 MB)
17/11/08 12:17:21 INFO spark.SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1006
17/11/08 12:17:21 INFO scheduler.DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[2] at filter at SimpleApp.scala:12) (first 15 tasks are for partitions Vector(0, 1))
17/11/08 12:17:21 INFO scheduler.TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
17/11/08 12:17:21 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, ANY, 4855 bytes)
17/11/08 12:17:21 INFO scheduler.TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, localhost, executor driver, partition 1, ANY, 4855 bytes)
17/11/08 12:17:21 INFO executor.Executor: Running task 0.0 in stage 0.0 (TID 0)
17/11/08 12:17:21 INFO executor.Executor: Running task 1.0 in stage 0.0 (TID 1)
17/11/08 12:17:21 INFO executor.Executor: Fetching spark://10.199.220.51:40749/jars/simple-project_2.12-1.0.jar with timestamp 1510118240125
17/11/08 12:17:21 INFO client.TransportClientFactory: Successfully created connection to /10.199.220.51:40749 after 17 ms (0 ms spent in bootstraps)
17/11/08 12:17:21 INFO util.Utils: Fetching spark://10.199.220.51:40749/jars/simple-project_2.12-1.0.jar to /tmp/spark-7b7489a6-acb0-4c5f-8e2a-5edddf44e444/userFiles-f2bf62b7-513b-4de9-9295-3c515efa4383/fetchFileTemp8756367481541820407.tmp
17/11/08 12:17:21 INFO executor.Executor: Adding file:/tmp/spark-7b7489a6-acb0-4c5f-8e2a-5edddf44e444/userFiles-f2bf62b7-513b-4de9-9295-3c515efa4383/simple-project_2.12-1.0.jar to class loader
17/11/08 12:17:21 ERROR executor.Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.io.IOException: unexpected exception type
	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2036)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
	... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	at SimpleApp$.$deserializeLambda$(SimpleApp.scala)
	... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	... 34 more
17/11/08 12:17:21 ERROR executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.io.IOException: unexpected exception type
	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2036)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
	... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	at SimpleApp$.$deserializeLambda$(SimpleApp.scala)
	... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	... 34 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.LambdaDeserialize
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 34 more
17/11/08 12:17:21 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: unexpected exception type
	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2036)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
	... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	at SimpleApp$.$deserializeLambda$(SimpleApp.scala)
	... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	... 34 more
17/11/08 12:17:21 ERROR scheduler.TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
17/11/08 12:17:21 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/11/08 12:17:21 INFO scheduler.TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1) on localhost, executor driver: java.io.IOException (unexpected exception type) [duplicate 1]
17/11/08 12:17:21 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
17/11/08 12:17:21 INFO scheduler.TaskSchedulerImpl: Cancelling stage 0
17/11/08 12:17:21 INFO scheduler.DAGScheduler: ResultStage 0 (count at SimpleApp.scala:12) failed in 0.189 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: unexpected exception type
	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2036)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
	... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	at SimpleApp$.$deserializeLambda$(SimpleApp.scala)
	... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	... 34 more
Driver stacktrace:
17/11/08 12:17:21 INFO scheduler.DAGScheduler: Job 0 failed: count at SimpleApp.scala:12, took 0.283017 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.io.IOException: unexpected exception type
	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2036)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
	... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	at SimpleApp$.$deserializeLambda$(SimpleApp.scala)
	... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	... 34 more
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2062)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
	at org.apache.spark.rdd.RDD.count(RDD.scala:1158)
	at SimpleApp$.main(SimpleApp.scala:12)
	at SimpleApp.main(SimpleApp.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: unexpected exception type
	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2036)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
	... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	at SimpleApp$.$deserializeLambda$(SimpleApp.scala)
	... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	... 34 more
17/11/08 12:17:21 INFO spark.SparkContext: Invoking stop() from shutdown hook
17/11/08 12:17:21 INFO server.AbstractConnector: Stopped Spark@672f11c2{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
17/11/08 12:17:21 INFO ui.SparkUI: Stopped Spark web UI at http://10.199.220.51:4040
17/11/08 12:17:21 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/11/08 12:17:21 INFO memory.MemoryStore: MemoryStore cleared
17/11/08 12:17:21 INFO storage.BlockManager: BlockManager stopped
17/11/08 12:17:21 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
17/11/08 12:17:21 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/11/08 12:17:21 INFO spark.SparkContext: Successfully stopped SparkContext
17/11/08 12:17:21 INFO util.ShutdownHookManager: Shutdown hook called
17/11/08 12:17:21 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7b7489a6-acb0-4c5f-8e2a-5edddf44e444 |
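Note: the root cause here is `java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize`. That class exists only in the Scala 2.12 runtime, and the jar name in the log (`target/scala-2.12/simple-project_2.12-1.0.jar`) shows the application was compiled with Scala 2.12, while Spark 2.2.0 is built against Scala 2.11. Rebuilding the application with a 2.11.x Scala version should resolve it. A minimal `build.sbt` sketch, assuming the standard sbt layout implied by the jar name (project name and Spark dependency line are illustrative):

```scala
// build.sbt -- sketch for a Spark 2.2.0 application.
name := "Simple Project"

version := "1.0"

// Spark 2.2.0 is published for Scala 2.11, so the application must be
// compiled with a matching 2.11.x version; 2.12 bytecode references
// scala/runtime/LambdaDeserialize, which 2.11 does not provide.
scalaVersion := "2.11.12"

// "provided" because spark-submit supplies Spark classes at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
```

After changing `scalaVersion`, run `sbt clean package`; the artifact should then land under `target/scala-2.11/` with a `_2.11` suffix instead of `_2.12`.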