@Aslan
Created October 23, 2015 23:15

executing pwd:
/home/hadoop
executing /usr/bin/spark-submit --class com.truex.prometheus.CLIJob Prometheus-assembly-0.0.1.jar -e 'select x.id, x.title, x.description, x.mediaavailableDate as available_date, x.mediaexpirationDate as expiration_date, mediacategories.medianame as media_name, x.mediakeywords as keywords, mediaratings.scheme as rating_scheme, mediaratings.rating, cast(mediaratings.subRatings as String) as sub_ratings, content.plfileduration as duration, x.plmediaprovider as provider, x.ngccontentAdType as ad_type, x.ngcepisodeNumber as episode, ngcnetwork as network, x.ngcseasonNumber as season_number, x.ngcuID as ngc_uid, x.ngcvideoType as video_type from etl lateral view explode(entries) entries as x lateral view explode(x.mediacategories) cat as mediacategories lateral view explode(x.mediaratings) r as mediaratings lateral view explode(x.mediacontent) mediacontent as content lateral view outer explode(x.ngcnetwork) net as ngcnetworkr' -j http://feed.theplatform.com/f/ngc/ngcngw-analytics /tmp/test:
ERROR: The execution of command -/usr/bin/spark-submit --class com.truex.prometheus.CLIJob Prometheus-assembly-0.0.1.jar -e 'select x.id, x.title, x.description, x.mediaavailableDate as available_date, x.mediaexpirationDate as expiration_date, mediacategories.medianame as media_name, x.mediakeywords as keywords, mediaratings.scheme as rating_scheme, mediaratings.rating, cast(mediaratings.subRatings as String) as sub_ratings, content.plfileduration as duration, x.plmediaprovider as provider, x.ngccontentAdType as ad_type, x.ngcepisodeNumber as episode, ngcnetwork as network, x.ngcseasonNumber as season_number, x.ngcuID as ngc_uid, x.ngcvideoType as video_type from etl lateral view explode(entries) entries as x lateral view explode(x.mediacategories) cat as mediacategories lateral view explode(x.mediaratings) r as mediaratings lateral view explode(x.mediacontent) mediacontent as content lateral view outer explode(x.ngcnetwork) net as ngcnetworkr' -j http://feed.theplatform.com/f/ngc/ngcngw-analytics /tmp/test on remote host failed with exit code 1.
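For readability, the HiveQL passed via -e above, reflowed (content verbatim from the command; note the final alias ngcnetworkr, which appears to carry an accidental trailing "r" on ngcnetwork):

    select
      x.id,
      x.title,
      x.description,
      x.mediaavailableDate as available_date,
      x.mediaexpirationDate as expiration_date,
      mediacategories.medianame as media_name,
      x.mediakeywords as keywords,
      mediaratings.scheme as rating_scheme,
      mediaratings.rating,
      cast(mediaratings.subRatings as String) as sub_ratings,
      content.plfileduration as duration,
      x.plmediaprovider as provider,
      x.ngccontentAdType as ad_type,
      x.ngcepisodeNumber as episode,
      ngcnetwork as network,
      x.ngcseasonNumber as season_number,
      x.ngcuID as ngc_uid,
      x.ngcvideoType as video_type
    from etl
      lateral view explode(entries) entries as x
      lateral view explode(x.mediacategories) cat as mediacategories
      lateral view explode(x.mediaratings) r as mediaratings
      lateral view explode(x.mediacontent) mediacontent as content
      lateral view outer explode(x.ngcnetwork) net as ngcnetworkr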
STDERR: 15/10/23 23:07:21 INFO SparkContext: Running Spark version 1.5.0
15/10/23 23:07:21 INFO SecurityManager: Changing view acls to: hadoop
15/10/23 23:07:21 INFO SecurityManager: Changing modify acls to: hadoop
15/10/23 23:07:21 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/10/23 23:07:22 INFO Slf4jLogger: Slf4jLogger started
15/10/23 23:07:22 INFO Remoting: Starting remoting
15/10/23 23:07:23 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@10.65.200.150:44966]
15/10/23 23:07:23 INFO Utils: Successfully started service 'sparkDriver' on port 44966.
15/10/23 23:07:23 INFO SparkEnv: Registering MapOutputTracker
15/10/23 23:07:23 INFO SparkEnv: Registering BlockManagerMaster
15/10/23 23:07:23 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-aa6f1e74-0d4e-49cc-b88f-de05da0af6eb
15/10/23 23:07:23 INFO MemoryStore: MemoryStore started with capacity 535.0 MB
15/10/23 23:07:23 INFO HttpFileServer: HTTP File server directory is /tmp/spark-e8264f61-0856-4b33-9aeb-0476a68ee141/httpd-3353a581-8e9f-43be-86b5-8f335ff2163d
15/10/23 23:07:23 INFO HttpServer: Starting HTTP Server
15/10/23 23:07:23 INFO Utils: Successfully started service 'HTTP file server' on port 54242.
15/10/23 23:07:23 INFO SparkEnv: Registering OutputCommitCoordinator
15/10/23 23:07:24 INFO Utils: Successfully started service 'SparkUI' on port 4040.
15/10/23 23:07:24 INFO SparkUI: Started SparkUI at http://10.65.200.150:4040
15/10/23 23:07:24 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
15/10/23 23:07:24 INFO RMProxy: Connecting to ResourceManager at ip-10-65-200-150.ec2.internal/10.65.200.150:8032
15/10/23 23:07:24 INFO Client: Requesting a new application from cluster with 2 NodeManagers
15/10/23 23:07:24 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (11520 MB per container)
15/10/23 23:07:24 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
15/10/23 23:07:24 INFO Client: Setting up container launch context for our AM
15/10/23 23:07:24 INFO Client: Setting up the launch environment for our AM container
15/10/23 23:07:24 INFO Client: Preparing resources for our AM container
15/10/23 23:07:25 INFO Client: Uploading resource file:/usr/lib/spark/lib/spark-assembly-1.5.0-hadoop2.6.0-amzn-1.jar -> hdfs://ip-10-65-200-150.ec2.internal:8020/user/hadoop/.sparkStaging/application_1444274555723_0046/spark-assembly-1.5.0-hadoop2.6.0-amzn-1.jar
15/10/23 23:07:25 INFO MetricsSaver: MetricsConfigRecord disabledInCluster: false instanceEngineCycleSec: 60 clusterEngineCycleSec: 60 disableClusterEngine: false maxMemoryMb: 3072 maxInstanceCount: 500 lastModified: 1444274560440
15/10/23 23:07:25 INFO MetricsSaver: Created MetricsSaver j-2US4HNPLS1SJO:i-131cdec7:SparkSubmit:26003 period:60 /mnt/var/em/raw/i-131cdec7_20151023_SparkSubmit_26003_raw.bin
15/10/23 23:07:26 INFO MetricsSaver: 1 aggregated HDFSWriteDelay 64 raw values into 1 aggregated values, total 1
15/10/23 23:07:27 INFO Client: Uploading resource file:/tmp/spark-e8264f61-0856-4b33-9aeb-0476a68ee141/__spark_conf__2670166078113187911.zip -> hdfs://ip-10-65-200-150.ec2.internal:8020/user/hadoop/.sparkStaging/application_1444274555723_0046/__spark_conf__2670166078113187911.zip
15/10/23 23:07:27 INFO SecurityManager: Changing view acls to: hadoop
15/10/23 23:07:27 INFO SecurityManager: Changing modify acls to: hadoop
15/10/23 23:07:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/10/23 23:07:27 INFO Client: Submitting application 46 to ResourceManager
15/10/23 23:07:27 INFO YarnClientImpl: Submitted application application_1444274555723_0046
15/10/23 23:07:28 INFO Client: Application report for application_1444274555723_0046 (state: ACCEPTED)
15/10/23 23:07:28 INFO Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1445641647868
final status: UNDEFINED
tracking URL: http://ip-10-65-200-150.ec2.internal:20888/proxy/application_1444274555723_0046/
user: hadoop
15/10/23 23:07:29 INFO Client: Application report for application_1444274555723_0046 (state: ACCEPTED)
15/10/23 23:07:30 INFO Client: Application report for application_1444274555723_0046 (state: ACCEPTED)
15/10/23 23:07:31 INFO Client: Application report for application_1444274555723_0046 (state: ACCEPTED)
15/10/23 23:07:32 INFO Client: Application report for application_1444274555723_0046 (state: ACCEPTED)
15/10/23 23:07:33 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as AkkaRpcEndpointRef(Actor[akka.tcp://sparkYarnAM@10.67.169.247:44761/user/YarnAM#1964303202])
15/10/23 23:07:33 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> ip-10-65-200-150.ec2.internal, PROXY_URI_BASES -> http://ip-10-65-200-150.ec2.internal:20888/proxy/application_1444274555723_0046), /proxy/application_1444274555723_0046
15/10/23 23:07:33 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
15/10/23 23:07:33 INFO Client: Application report for application_1444274555723_0046 (state: RUNNING)
15/10/23 23:07:33 INFO Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: 10.67.169.247
ApplicationMaster RPC port: 0
queue: default
start time: 1445641647868
final status: UNDEFINED
tracking URL: http://ip-10-65-200-150.ec2.internal:20888/proxy/application_1444274555723_0046/
user: hadoop
15/10/23 23:07:33 INFO YarnClientSchedulerBackend: Application application_1444274555723_0046 has started running.
15/10/23 23:07:34 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 55370.
15/10/23 23:07:34 INFO NettyBlockTransferService: Server created on 55370
15/10/23 23:07:34 INFO BlockManagerMaster: Trying to register BlockManager
15/10/23 23:07:34 INFO BlockManagerMasterEndpoint: Registering block manager 10.65.200.150:55370 with 535.0 MB RAM, BlockManagerId(driver, 10.65.200.150, 55370)
15/10/23 23:07:34 INFO BlockManagerMaster: Registered BlockManager
15/10/23 23:07:34 INFO EventLoggingListener: Logging events to hdfs:///var/log/spark/apps/application_1444274555723_0046
15/10/23 23:07:38 INFO YarnClientSchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@ip-10-67-169-247.ec2.internal:34087/user/Executor#-1495293751]) with ID 2
15/10/23 23:07:38 INFO BlockManagerMasterEndpoint: Registering block manager ip-10-67-169-247.ec2.internal:54724 with 535.0 MB RAM, BlockManagerId(2, ip-10-67-169-247.ec2.internal, 54724)
15/10/23 23:07:39 INFO YarnClientSchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@ip-10-169-170-124.ec2.internal:35718/user/Executor#-1940003871]) with ID 1
15/10/23 23:07:39 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8
15/10/23 23:07:39 INFO BlockManagerMasterEndpoint: Registering block manager ip-10-169-170-124.ec2.internal:56923 with 535.0 MB RAM, BlockManagerId(1, ip-10-169-170-124.ec2.internal, 56923)
15/10/23 23:07:40 INFO HiveContext: Initializing execution hive, version 1.2.1
15/10/23 23:07:40 INFO ClientWrapper: Inspected Hadoop version: 2.6.0-amzn-1
15/10/23 23:07:40 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.6.0-amzn-1
15/10/23 23:07:40 INFO metastore: Trying to connect to metastore with URI thrift://ip-10-65-200-150.ec2.internal:9083
15/10/23 23:07:40 INFO metastore: Connected to metastore.
15/10/23 23:07:40 INFO SessionState: Created local directory: /tmp/9ecdac73-f007-469a-adab-63254bedeb3b_resources
15/10/23 23:07:40 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/9ecdac73-f007-469a-adab-63254bedeb3b
15/10/23 23:07:40 INFO SessionState: Created local directory: /tmp/hadoop/9ecdac73-f007-469a-adab-63254bedeb3b
15/10/23 23:07:40 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/9ecdac73-f007-469a-adab-63254bedeb3b/_tmp_space.db
15/10/23 23:07:40 INFO HiveContext: default warehouse location is /user/hive/warehouse
15/10/23 23:07:40 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
15/10/23 23:07:40 INFO ClientWrapper: Inspected Hadoop version: 2.4.0
15/10/23 23:07:40 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.4.0
15/10/23 23:07:41 WARN Configuration: mapred-site.xml:an attempt to override final parameter: mapreduce.cluster.local.dir; Ignoring.
15/10/23 23:07:41 WARN Configuration: mapred-site.xml:an attempt to override final parameter: mapreduce.cluster.local.dir; Ignoring.
15/10/23 23:07:41 WARN Configuration: mapred-site.xml:an attempt to override final parameter: mapreduce.cluster.local.dir; Ignoring.
15/10/23 23:07:41 WARN Configuration: mapred-site.xml:an attempt to override final parameter: mapreduce.cluster.local.dir; Ignoring.
15/10/23 23:07:41 WARN Configuration: mapred-site.xml:an attempt to override final parameter: mapreduce.cluster.local.dir; Ignoring.
15/10/23 23:07:41 WARN Configuration: mapred-site.xml:an attempt to override final parameter: mapreduce.cluster.local.dir; Ignoring.
15/10/23 23:07:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/10/23 23:07:41 INFO metastore: Trying to connect to metastore with URI thrift://ip-10-65-200-150.ec2.internal:9083
15/10/23 23:07:41 INFO metastore: Connected to metastore.
15/10/23 23:07:41 WARN Configuration: mapred-site.xml:an attempt to override final parameter: mapreduce.cluster.local.dir; Ignoring.
15/10/23 23:07:42 INFO SessionState: Created local directory: /tmp/76771d53-4c21-44d5-8286-9289c8d1377e_resources
15/10/23 23:07:42 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/76771d53-4c21-44d5-8286-9289c8d1377e
15/10/23 23:07:42 INFO SessionState: Created local directory: /tmp/hadoop/76771d53-4c21-44d5-8286-9289c8d1377e
15/10/23 23:07:42 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/76771d53-4c21-44d5-8286-9289c8d1377e/_tmp_space.db
15/10/23 23:07:48 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
15/10/23 23:07:48 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
15/10/23 23:07:48 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
15/10/23 23:07:48 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
15/10/23 23:07:48 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
15/10/23 23:07:48 INFO SparkContext: Starting job: saveAsTextFile at CLIJob.scala:96
15/10/23 23:07:48 INFO DAGScheduler: Got job 0 (saveAsTextFile at CLIJob.scala:96) with 2 output partitions
15/10/23 23:07:48 INFO DAGScheduler: Final stage: ResultStage 0(saveAsTextFile at CLIJob.scala:96)
15/10/23 23:07:48 INFO DAGScheduler: Parents of final stage: List()
15/10/23 23:07:48 INFO DAGScheduler: Missing parents: List()
15/10/23 23:07:48 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at saveAsTextFile at CLIJob.scala:96), which has no missing parents
15/10/23 23:07:49 INFO MemoryStore: ensureFreeSpace(136240) called with curMem=0, maxMem=560993402
15/10/23 23:07:49 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 133.0 KB, free 534.9 MB)
15/10/23 23:07:49 INFO MemoryStore: ensureFreeSpace(47302) called with curMem=136240, maxMem=560993402
15/10/23 23:07:49 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 46.2 KB, free 534.8 MB)
15/10/23 23:07:49 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.65.200.150:55370 (size: 46.2 KB, free: 535.0 MB)
15/10/23 23:07:49 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:861
15/10/23 23:07:49 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at saveAsTextFile at CLIJob.scala:96)
15/10/23 23:07:49 INFO YarnScheduler: Adding task set 0.0 with 2 tasks
15/10/23 23:07:49 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, ip-10-67-169-247.ec2.internal, PROCESS_LOCAL, 2087 bytes)
15/10/23 23:07:49 WARN TaskSetManager: Stage 0 contains a task of very large size (188 KB). The maximum recommended task size is 100 KB.
15/10/23 23:07:49 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, ip-10-169-170-124.ec2.internal, PROCESS_LOCAL, 193249 bytes)
15/10/23 23:07:49 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on ip-10-67-169-247.ec2.internal:54724 (size: 46.2 KB, free: 535.0 MB)
15/10/23 23:07:49 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on ip-10-169-170-124.ec2.internal:56923 (size: 46.2 KB, free: 535.0 MB)
15/10/23 23:07:50 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1589 ms on ip-10-67-169-247.ec2.internal (1/2)
15/10/23 23:07:51 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1892 ms on ip-10-169-170-124.ec2.internal (2/2)
15/10/23 23:07:51 INFO DAGScheduler: ResultStage 0 (saveAsTextFile at CLIJob.scala:96) finished in 1.925 s
15/10/23 23:07:51 INFO YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/10/23 23:07:51 INFO DAGScheduler: Job 0 finished: saveAsTextFile at CLIJob.scala:96, took 2.420888 s
15/10/23 23:07:51 INFO SparkContext: Starting job: json at CLIJob.scala:104
15/10/23 23:07:51 INFO DAGScheduler: Got job 1 (json at CLIJob.scala:104) with 2 output partitions
15/10/23 23:07:51 INFO DAGScheduler: Final stage: ResultStage 1(json at CLIJob.scala:104)
15/10/23 23:07:51 INFO DAGScheduler: Parents of final stage: List()
15/10/23 23:07:51 INFO DAGScheduler: Missing parents: List()
15/10/23 23:07:51 INFO DAGScheduler: Submitting ResultStage 1 (MapPartitionsRDD[5] at json at CLIJob.scala:104), which has no missing parents
15/10/23 23:07:51 INFO MemoryStore: ensureFreeSpace(3776) called with curMem=183542, maxMem=560993402
15/10/23 23:07:51 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.7 KB, free 534.8 MB)
15/10/23 23:07:51 INFO MemoryStore: ensureFreeSpace(2070) called with curMem=187318, maxMem=560993402
15/10/23 23:07:51 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 2.0 KB, free 534.8 MB)
15/10/23 23:07:51 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on 10.65.200.150:55370 (size: 2.0 KB, free: 535.0 MB)
15/10/23 23:07:51 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:861
15/10/23 23:07:51 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 1 (MapPartitionsRDD[5] at json at CLIJob.scala:104)
15/10/23 23:07:51 INFO YarnScheduler: Adding task set 1.0 with 2 tasks
15/10/23 23:07:51 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 2, ip-10-169-170-124.ec2.internal, PROCESS_LOCAL, 2087 bytes)
15/10/23 23:07:51 WARN TaskSetManager: Stage 1 contains a task of very large size (188 KB). The maximum recommended task size is 100 KB.
15/10/23 23:07:51 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 3, ip-10-67-169-247.ec2.internal, PROCESS_LOCAL, 193249 bytes)
15/10/23 23:07:51 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on ip-10-169-170-124.ec2.internal:56923 (size: 2.0 KB, free: 535.0 MB)
15/10/23 23:07:51 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on ip-10-67-169-247.ec2.internal:54724 (size: 2.0 KB, free: 535.0 MB)
15/10/23 23:07:51 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 2, ip-10-169-170-124.ec2.internal): java.lang.ClassNotFoundException: com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
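The class the executor cannot load, com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1, is a compiled anonymous function from the application's own code, so every retry below fails identically: the executor classloader simply does not have that class. A hypothetical reconstruction of the kind of Scala source that produces this class name (the real RDDExtension is not included in this gist, and the method body is an assumption):

    import org.apache.spark.rdd.RDD

    // Hypothetical sketch only. Under Scala 2.10, the lambda inside
    // sanitizeJSON compiles to a separate class named
    // RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1 (exactly the
    // name in the trace above), and executors must load it from the
    // application jar to deserialize the task.
    object RDDExtension {
      implicit class ExtendedStringRDD(rdd: RDD[String]) {
        def sanitizeJSON: RDD[String] =
          rdd.map(line => line.trim) // illustrative body, assumed
      }
    }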
15/10/23 23:07:51 INFO TaskSetManager: Starting task 0.1 in stage 1.0 (TID 4, ip-10-169-170-124.ec2.internal, PROCESS_LOCAL, 2087 bytes)
15/10/23 23:07:51 INFO TaskSetManager: Lost task 1.0 in stage 1.0 (TID 3) on executor ip-10-67-169-247.ec2.internal: java.lang.ClassNotFoundException (com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1) [duplicate 1]
15/10/23 23:07:51 INFO TaskSetManager: Starting task 1.1 in stage 1.0 (TID 5, ip-10-67-169-247.ec2.internal, PROCESS_LOCAL, 193249 bytes)
15/10/23 23:07:51 INFO TaskSetManager: Lost task 0.1 in stage 1.0 (TID 4) on executor ip-10-169-170-124.ec2.internal: java.lang.ClassNotFoundException (com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1) [duplicate 2]
15/10/23 23:07:51 INFO TaskSetManager: Starting task 0.2 in stage 1.0 (TID 6, ip-10-169-170-124.ec2.internal, PROCESS_LOCAL, 2087 bytes)
15/10/23 23:07:51 INFO TaskSetManager: Lost task 0.2 in stage 1.0 (TID 6) on executor ip-10-169-170-124.ec2.internal: java.lang.ClassNotFoundException (com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1) [duplicate 3]
15/10/23 23:07:51 INFO TaskSetManager: Starting task 0.3 in stage 1.0 (TID 7, ip-10-169-170-124.ec2.internal, PROCESS_LOCAL, 2087 bytes)
15/10/23 23:07:51 INFO TaskSetManager: Lost task 1.1 in stage 1.0 (TID 5) on executor ip-10-67-169-247.ec2.internal: java.lang.ClassNotFoundException (com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1) [duplicate 4]
15/10/23 23:07:51 INFO TaskSetManager: Starting task 1.2 in stage 1.0 (TID 8, ip-10-67-169-247.ec2.internal, PROCESS_LOCAL, 193249 bytes)
15/10/23 23:07:51 INFO TaskSetManager: Lost task 0.3 in stage 1.0 (TID 7) on executor ip-10-169-170-124.ec2.internal: java.lang.ClassNotFoundException (com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1) [duplicate 5]
15/10/23 23:07:51 ERROR TaskSetManager: Task 0 in stage 1.0 failed 4 times; aborting job
15/10/23 23:07:51 INFO YarnScheduler: Cancelling stage 1
15/10/23 23:07:51 INFO YarnScheduler: Stage 1 was cancelled
15/10/23 23:07:51 INFO TaskSetManager: Lost task 1.2 in stage 1.0 (TID 8) on executor ip-10-67-169-247.ec2.internal: java.lang.ClassNotFoundException (com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1) [duplicate 6]
15/10/23 23:07:51 INFO YarnScheduler: Removed TaskSet 1.0, whose tasks have all completed, from pool
15/10/23 23:07:51 INFO DAGScheduler: ResultStage 1 (json at CLIJob.scala:104) failed in 0.304 s
15/10/23 23:07:51 INFO DAGScheduler: Job 1 failed: json at CLIJob.scala:104, took 0.331965 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 7, ip-10-169-170-124.ec2.internal): java.lang.ClassNotFoundException: com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1280)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1268)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1267)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1267)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1493)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1813)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1933)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1003)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
at org.apache.spark.rdd.RDD.reduce(RDD.scala:985)
at org.apache.spark.rdd.RDD$$anonfun$treeAggregate$1.apply(RDD.scala:1114)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:108)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:306)
at org.apache.spark.rdd.RDD.treeAggregate(RDD.scala:1091)
at org.apache.spark.sql.execution.datasources.json.InferSchema$.apply(InferSchema.scala:58)
at org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$6.apply(JSONRelation.scala:105)
at org.apache.spark.sql.execution.datasources.json.JSONRelation$$anonfun$6.apply(JSONRelation.scala:100)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema$lzycompute(JSONRelation.scala:100)
at org.apache.spark.sql.execution.datasources.json.JSONRelation.dataSchema(JSONRelation.scala:99)
at org.apache.spark.sql.sources.HadoopFsRelation.schema$lzycompute(interfaces.scala:561)
at org.apache.spark.sql.sources.HadoopFsRelation.schema(interfaces.scala:560)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:31)
at org.apache.spark.sql.SQLContext.baseRelationToDataFrame(SQLContext.scala:389)
at org.apache.spark.sql.DataFrameReader.json(DataFrameReader.scala:245)
at com.truex.prometheus.CLIJob$$anon$1.execute(CLIJob.scala:104)
at com.truex.prometheus.CLIJob$.main(CLIJob.scala:122)
at com.truex.prometheus.CLIJob.main(CLIJob.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.truex.prometheus.RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1897)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1997)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1921)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:88)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
15/10/23 23:07:51 INFO SparkContext: Invoking stop() from shutdown hook
15/10/23 23:07:51 INFO SparkUI: Stopped Spark web UI at http://10.65.200.150:4040
15/10/23 23:07:51 INFO DAGScheduler: Stopping DAGScheduler
15/10/23 23:07:51 INFO YarnClientSchedulerBackend: Shutting down all executors
15/10/23 23:07:51 INFO YarnClientSchedulerBackend: Interrupting monitor thread
15/10/23 23:07:51 INFO YarnClientSchedulerBackend: Asking each executor to shut down
15/10/23 23:07:51 INFO YarnClientSchedulerBackend: Stopped
15/10/23 23:07:51 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
15/10/23 23:07:51 INFO MemoryStore: MemoryStore cleared
15/10/23 23:07:51 INFO BlockManager: BlockManager stopped
15/10/23 23:07:51 INFO BlockManagerMaster: BlockManagerMaster stopped
15/10/23 23:07:51 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
15/10/23 23:07:51 INFO SparkContext: Successfully stopped SparkContext
15/10/23 23:07:51 INFO ShutdownHookManager: Shutdown hook called
15/10/23 23:07:51 INFO ShutdownHookManager: Deleting directory /tmp/spark-e8264f61-0856-4b33-9aeb-0476a68ee141
15/10/23 23:07:51 INFO ShutdownHookManager: Deleting directory /tmp/spark-c5de504a-faa8-44d0-a2a1-661b30fc16f9
15/10/23 23:07:51 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
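Job 0 (saveAsTextFile at CLIJob.scala:96) completed, while only job 1 (json at CLIJob.scala:104), which exercises the sanitizeJSON closure, failed. That pattern usually points at a stale or incomplete Prometheus-assembly-0.0.1.jar rather than a broken cluster: the jar on the cluster was likely built before sanitizeJSON (or its lambda) existed. Rebuilding the assembly and re-submitting is the usual first step; as a cross-check, the jar can also be shipped explicitly from the driver. A minimal sketch, assuming the driver code can be changed (SparkContext.addJar is standard Spark API; the object name and surrounding code are assumptions, not the actual CLIJob source):

    import org.apache.spark.{SparkConf, SparkContext}

    object CLIJobMain {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("CLIJob"))
        // Ship the assembly explicitly so every executor can resolve
        // RDDExtension$ExtendedStringRDD$$anonfun$sanitizeJSON$1 when
        // deserializing task closures.
        sc.addJar("Prometheus-assembly-0.0.1.jar")
        // ... run the ETL exactly as CLIJob does ...
        sc.stop()
      }
    }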