
@cpwais
Created June 20, 2016 20:50
16/06/20 13:42:00 INFO type: initialize(alluxio://host1.my.domain:19998/my_file, Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml). Connecting to Alluxio: alluxio://host1.my.domain:19998/my_file
16/06/20 13:42:00 INFO type: alluxio://host1.my.domain:19998 alluxio://host1.my.domain:19998
16/06/20 13:42:00 INFO type: Loading Alluxio properties from Hadoop configuration: {}
16/06/20 13:42:00 DEBUG type: using override 10.0.2.243
16/06/20 13:42:00 INFO type: open(alluxio://host1.my.domain:19998/my_file, 65536)
16/06/20 13:42:00 DEBUG type: HdfsFileInputStream(/my_file, Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml, 65536, 0 bytes read, 0 bytes written, 1 read ops, 0 large read ops, 0 write ops, {})
16/06/20 13:42:00 INFO type: Alluxio client (version 1.1.0) is trying to connect with FileSystemMasterClient master @ host1.my.domain/10.0.2.243:19998
16/06/20 13:42:00 INFO type: Client registered with FileSystemMasterClient master @ host1.my.domain/10.0.2.243:19998
16/06/20 13:42:00 DEBUG type: Init FileInStream with options InStreamOptions{locationPolicy=LocalFirstPolicy{localHostName=host1.my.domain}, readType=CACHE, cachePartiallyReadBlock=true, seekBufferSize=1048576}
16/06/20 13:42:00 INFO type: Alluxio client (version 1.1.0) is trying to connect with BlockMasterClient master @ host1.my.domain/10.0.2.243:19998
16/06/20 13:42:00 INFO type: Client registered with BlockMasterClient master @ host1.my.domain/10.0.2.243:19998
16/06/20 13:42:00 DEBUG type: Failed to get BlockInStream for block with ID 16777216, using UFS instead. java.io.IOException: Block 16777216 is not available in Alluxio
16/06/20 13:42:00 DEBUG type: Discovered Under File System Factory implementation class alluxio.underfs.hdfs.HdfsUnderFileSystemFactory - alluxio.underfs.hdfs.HdfsUnderFileSystemFactory@e3d020
16/06/20 13:42:00 DEBUG type: Discovered Under File System Factory implementation class alluxio.underfs.local.LocalUnderFileSystemFactory - alluxio.underfs.local.LocalUnderFileSystemFactory@1cee3a80
16/06/20 13:42:00 DEBUG type: Discovered Under File System Factory implementation class alluxio.underfs.s3.S3UnderFileSystemFactory - alluxio.underfs.s3.S3UnderFileSystemFactory@b4c77a1
16/06/20 13:42:00 DEBUG type: Discovered Under File System Factory implementation class alluxio.underfs.gcs.GCSUnderFileSystemFactory - alluxio.underfs.gcs.GCSUnderFileSystemFactory@207538bd
16/06/20 13:42:00 DEBUG type: Discovered Under File System Factory implementation class alluxio.underfs.swift.SwiftUnderFileSystemFactory - alluxio.underfs.swift.SwiftUnderFileSystemFactory@245c2549
16/06/20 13:42:00 DEBUG type: Under File System Factory implementation class alluxio.underfs.local.LocalUnderFileSystemFactory is eligible for path /path/to/my_file
16/06/20 13:42:01 DEBUG type: Failed to get BlockInStream for block with ID 16777216, using UFS instead. java.io.IOException: Block 16777216 is not available in Alluxio
16/06/20 13:42:01 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.RuntimeException: No available Alluxio worker found
at alluxio.client.block.AlluxioBlockStore.getOutStream(AlluxioBlockStore.java:167)
at alluxio.client.file.FileInStream.updateCacheStream(FileInStream.java:473)
at alluxio.client.file.FileInStream.updateStreams(FileInStream.java:416)
at alluxio.client.file.FileInStream.close(FileInStream.java:147)
at alluxio.hadoop.HdfsFileInputStream.close(HdfsFileInputStream.java:115)
at java.io.FilterInputStream.close(FilterInputStream.java:181)
at org.spark-project.guava.io.Closeables.close(Closeables.java:77)
at org.apache.spark.input.PortableDataStream.toArray(PortableDataStream.scala:188)
at $line19.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:30)
at $line19.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$1.apply(<console>:30)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:103)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:47)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:273)
at scala.collection.AbstractIterator.to(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:265)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1157)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:252)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1157)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:927)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:927)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1858)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
at org.apache.spark.scheduler.Task.run(Task.scala:89)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
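The gist contains only the log, but the stack trace (`PortableDataStream.toArray` under `RDD.collect`) points at `sc.binaryFiles` in Spark 1.6 as the likely read path. A hypothetical sketch of a job that would exercise the same path follows; the host, port, and `/my_file` path are taken from the log above, while the app name and object name are assumptions, not from the gist:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical reproduction sketch, not the original job from the gist.
object ReadFromAlluxio {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("alluxio-read"))

    // Read a binary file through Alluxio's Hadoop-compatible client.
    // Per the log, the block is not in Alluxio, so the client falls back
    // to the under file system (UFS). With readType=CACHE, closing the
    // stream then tries to open an OutStream to cache the block in an
    // Alluxio worker; if no worker is registered with the master, that
    // close() fails with "No available Alluxio worker found" even though
    // the UFS read itself succeeded.
    val bytes = sc.binaryFiles("alluxio://host1.my.domain:19998/my_file")
      .map { case (_, stream) => stream.toArray() }
      .collect()

    println(s"read ${bytes.head.length} bytes")
    sc.stop()
  }
}
```

Under this reading, the error is a caching side effect rather than a read failure: checking that at least one Alluxio worker is alive (e.g. via the master web UI) or reading with `readType=NO_CACHE` would be the natural next diagnostic steps.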