stacktrace.txt
Created January 6, 2016 00:05
sltu@c77:/scratch/stephentu/keystone(master)$ OMP_NUM_THREADS=1 KEYSTONE_MEM=200g time ./bin/run-pipeline.sh pipelines.images.cifar.RandomPatchCifarFeaturizer --trainLocation ./cifar_train.bin --testLocation ./cifar_test.bin --numFilters 1024 --trainOutfile /mnt/sda/stephentu/cifar_train_featurized_augmented --testOutfile /mnt/sda/stephentu/cifar_test_featurized_augmented
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/scratch/stephentu/keystone/target/scala-2.10/keystoneml-assembly-0.3.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/scratch/stephentu/spark-1.3.1-bin-cdh4/lib/spark-assembly-1.3.1-hadoop2.0.0-mr1-cdh4.2.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
convolver.resWidth 27 convolver.resHeight 27
nResulting 25
16/01/05 13:58:58 INFO Cacher: CACHING 9
16/01/05 13:58:58 INFO ConcretePipeline: Fitting 'StandardScaler' [8]
16/01/05 13:59:00 INFO ConcretePipeline: Finished fitting 'StandardScaler' [8]
16/01/05 13:59:00 INFO Cacher: CACHING 16
16/01/05 13:59:00 INFO Cacher: CACHING 17
[Stage 4:> (0 + 48) / 48]
16/01/05 14:36:52 ERROR Executor: Exception in task 33.0 in stage 4.0 (TID 183)
java.lang.IllegalArgumentException: requirement failed: Vectors must have same length: 2048 != 24
at scala.Predef$.require(Predef.scala:233)
at breeze.linalg.DenseVector$$anon$4.apply(DenseVector.scala:550)
at breeze.linalg.DenseVector$$anon$4.apply(DenseVector.scala:548)
at breeze.linalg.operators.DenseVector_GenericOps$$anon$306.apply(DenseVectorOps.scala:581)
at breeze.linalg.operators.DenseVector_GenericOps$$anon$306.apply(DenseVectorOps.scala:578)
at breeze.linalg.ImmutableNumericOps$class.$minus(NumericOps.scala:55)
at breeze.linalg.DenseVector.$minus(DenseVector.scala:50)
at nodes.stats.StandardScalerModel.apply(StandardScaler.scala:26)
at nodes.stats.StandardScalerModel.apply(StandardScaler.scala:16)
at workflow.Transformer$$anonfun$apply$1.apply(Transformer.scala:27)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:249)
at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:172)
at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:79)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
at org.apache.spark.rdd.ZippedPartitionsRDD2.compute(ZippedPartitionsRDD.scala:88)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:64)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
16/01/05 14:36:52 ERROR TaskSetManager: Task 33 in stage 4.0 failed 1 times; aborting job
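The failure originates in Breeze's element-wise subtraction inside StandardScalerModel.apply: the operation requires both vectors to have the same length, so a row whose dimensionality differs from the vector learned at fit time throws the `requirement failed` seen above. A minimal, hypothetical Java sketch (not the actual Breeze or KeystoneML source; `subtract` and `LengthCheckSketch` are illustrative names) of the kind of length guard that produces this message:

```java
// Hypothetical sketch of Breeze-style element-wise subtraction with a length guard.
// Mismatched lengths (2048 vs. 24, as in the log) trigger the same kind of
// IllegalArgumentException before any arithmetic happens.
public class LengthCheckSketch {
    static double[] subtract(double[] a, double[] b) {
        if (a.length != b.length) {
            throw new IllegalArgumentException(
                "requirement failed: Vectors must have same length: "
                + a.length + " != " + b.length);
        }
        double[] out = new double[a.length];
        for (int i = 0; i < a.length; i++) {
            out[i] = a[i] - b[i]; // element-wise difference
        }
        return out;
    }

    public static void main(String[] args) {
        double[] fitted = new double[2048]; // e.g. a scaler fit on 2048-dim features
        double[] row    = new double[24];   // a row with unexpected dimensionality
        try {
            subtract(fitted, row);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Since the scaler in the log was fit during the same pipeline run, a mismatch like this usually points at inconsistent featurization between the fit and transform stages rather than at Breeze itself.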