@Arnold1
Created August 14, 2017 21:03

scalding issue

This file has been truncated, but you can view the full file.
$ ./sbt update
[warn] Executing in batch mode.
[warn] For better performance, hit [ENTER] to switch to interactive mode, or
[warn] consider launching sbt without any commands, or explicitly passing 'shell'
[info] Loading project definition from /Users/geri/work/scalding/project
[info] Updating {file:/Users/geri/work/scalding/project/}scalding-build...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.sbt:sbt-git:0.6.2 -> 0.8.5
[warn] Run 'evicted' to see detailed eviction warnings
[info] Resolving key references (15830 settings) ...
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[info] Set current project to scalding (in build file:/Users/geri/work/scalding/)
[info] Updating {file:/Users/geri/work/scalding/}scalding...
[info] Done updating.
[success] Total time: 0 s, completed Aug 14, 2017 4:27:12 PM
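The `SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder"` lines above are a known, harmless warning: no SLF4J binding is on sbt's classpath, so SLF4J falls back to a no-op logger and discards log output. If log output is wanted, a binding can be added to the build; a minimal sketch (the version number is illustrative, not taken from this build):

```scala
// build.sbt (sketch): add an SLF4J binding so log output is not discarded.
// slf4j-simple writes to stderr; logback-classic is another common choice.
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.25"
```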
$ ./sbt test
[warn] Executing in batch mode.
[warn] For better performance, hit [ENTER] to switch to interactive mode, or
[warn] consider launching sbt without any commands, or explicitly passing 'shell'
[info] Loading project definition from /Users/geri/work/scalding/project
[info] Resolving key references (15830 settings) ...
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[info] Set current project to scalding (in build file:/Users/geri/work/scalding/)
[info] Formatting 2 Scala sources {file:/Users/geri/work/scalding/}scalding-args(test) ...
[info] Formatting 1 Scala source {file:/Users/geri/work/scalding/}scalding-json(test) ...
[info] Formatting 1 Scala source {file:/Users/geri/work/scalding/}scalding-jdbc(test) ...
[info] Formatting 1 Scala source {file:/Users/geri/work/scalding/}scalding-db(test) ...
[info] Formatting 1 Scala source {file:/Users/geri/work/scalding/}scalding-repl(test) ...
[info] Formatting 1 Scala source {file:/Users/geri/work/scalding/}scalding-hraven(test) ...
[info] Formatting 69 Scala sources {file:/Users/geri/work/scalding/}scalding-core(test) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/KryoTest.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1890,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/ExecutionTest.scala: expected start of definition, but was Token(DEF,def,26050,def)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/FieldImpsTest.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,757,<)
[info] Reformatted 2 Scala sources {file:/Users/geri/work/scalding/}scalding-core(test).
[info] Formatting 5 Scala sources {file:/Users/geri/work/scalding/}scalding-date(test) ...
[info] Formatting 4 Scala sources {file:/Users/geri/work/scalding/}scalding-estimators-test(test) ...
[info] Formatting 4 Scala sources {file:/Users/geri/work/scalding/}scalding-thrift-macros(test) ...
[info] Formatting 3 Scala sources {file:/Users/geri/work/scalding/}scalding-hadoop-test(test) ...
[info] Formatting 9 Scala sources {file:/Users/geri/work/scalding/}scalding-commons(test) ...
[info] Formatting 6 Scala sources {file:/Users/geri/work/scalding/}scalding-serialization(test) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/WriterReaderProperties.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1992,<)
[info] Formatting 2 Scala sources {file:/Users/geri/work/scalding/}scalding-parquet-scrooge(test) ...
[info] Formatting 4 Scala sources {file:/Users/geri/work/scalding/}scalding-parquet(test) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/ParquetSourcesTests.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,985,<)
[info] Updating {file:/Users/geri/work/scalding/}scalding-args...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-serialization...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-date...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}maple...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-thrift-macros-fixtures...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-parquet-scrooge-fixtures...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-parquet-fixtures...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-core...
[info] Done updating.
[info] Formatting 2 Scala sources {file:/Users/geri/work/scalding/}scalding-args(compile) ...
[info] Formatting 35 Scala sources {file:/Users/geri/work/scalding/}scalding-serialization(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,762,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/Writer.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,4033,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala: expected start of definition, but was Token(DEF,def,3062,def)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/Boxed.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,32003,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,12363,<)
[info] Formatting 8 Scala sources {file:/Users/geri/work/scalding/}scalding-date(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-date/src/main/scala/com/twitter/scalding/Duration.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,2652,<)
[info] Updating {file:/Users/geri/work/scalding/}scalding-json...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-jdbc...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-db...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-repl...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-hraven...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-avro...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-hadoop-test...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}execution-tutorial...
[info] Done updating.
[info] Formatting 148 Scala sources {file:/Users/geri/work/scalding/}scalding-core(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/source/TypedSequenceFile.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1194,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/GroupBuilder.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,12633,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/GeneratedConversions.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,19031,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/WithDescription.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,871,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Config.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,6859,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/PartitionedTextLine.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,5374,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/WithReducers.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1085,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/source/NullSink.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,505,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/BijectedSourceSink.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1248,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/mathematics/TypedSimilarity.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,3780,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/memory_backend/MemoryBackend.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,669,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/ReduceOperations.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1281,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/TypedSink.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1061,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TupleSetter.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,2110,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/GeneratedFlattenGroup.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,639,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Execution.scala: Expected token RPAREN but got Token(XML_START_OPEN,<,3171,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/FoldOperations.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,823,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/JobTest.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1161,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/PartitionUtil.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,2296,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/XHandler.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,307,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/RichPipe.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,5390,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Job.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,11393,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/FlatMappedFn.scala: Expected token RPAREN but got Token(XML_START_OPEN,<,3315,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/StreamOperations.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,880,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/WritableSequenceFile.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1066,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/MemorySink.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1235,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/FieldConversions.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,6265,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TypedDelimited.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,3578,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/PartitionSchemed.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,2392,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/mathematics/Matrix.scala: Expected identifier, but got Token(XML_START_OPEN,<,6783,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/GeneratedTypedSource.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,6426,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TuplePacker.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1254,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/KeyedList.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,981,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/CascadingBackend.scala: expected start of definition, but was Token(DEF,def,6544,def)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/source/TypedText.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,3949,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/TypedPipe.scala: Expected token RPAREN but got Token(XML_START_OPEN,<,11312,<)
[info] Reformatted 3 Scala sources {file:/Users/geri/work/scalding/}scalding-core(compile).
[info] Updating {file:/Users/geri/work/scalding/}scalding-estimators-test...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-thrift-macros...
[info] Done updating.
[info] Updating {file:/Users/geri/work/scalding/}scalding-commons...
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * org.apache.thrift:libthrift:0.5.0 -> 0.7.0
[warn] Run 'evicted' to see detailed eviction warnings
[info] Updating {file:/Users/geri/work/scalding/}scalding-parquet...
[info] Done updating.
[info] Formatting 2 Scala sources {file:/Users/geri/work/scalding/}scalding-json(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-json/src/main/scala/com/twitter/scalding/TypedJson.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1087,<)
[info] Formatting 5 Scala sources {file:/Users/geri/work/scalding/}scalding-jdbc(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-jdbc/src/main/scala/com/twitter/scalding/jdbc/DriverColumnDefiner.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,841,<)
[info] Formatting 18 Scala sources {file:/Users/geri/work/scalding/}scalding-db(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala: expected start of definition, but was Token(DEF,def,4592,def)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/ColumnFormat.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,881,<)
[info] Formatting 4 Scala sources {file:/Users/geri/work/scalding/}scalding-repl(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-repl/src/main/scala/com/twitter/scalding/ReplImplicits.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,9615,<)
[info] Formatting 3 Scala sources {file:/Users/geri/work/scalding/}scalding-hraven(compile) ...
[info] Reformatted 1 Scala source {file:/Users/geri/work/scalding/}scalding-hraven(compile).
[info] Formatting 3 Scala sources {file:/Users/geri/work/scalding/}scalding-avro(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-avro/src/main/scala/com/twitter/scalding/avro/AvroSource.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,3160,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-avro/src/main/scala/com/twitter/scalding/avro/package.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1084,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-avro/src/main/scala/com/twitter/scalding/avro/SchemaType.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1880,<)
[info] Formatting 7 Scala sources {file:/Users/geri/work/scalding/}scalding-hadoop-test(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/HadoopPlatform.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,252,<)
[info] Updating {file:/Users/geri/work/scalding/}scalding-parquet-scrooge...
[warn] Multiple dependencies with the same organization/name but different versions. To avoid conflict, pick one version:
[warn] * com.novocode:junit-interface:(0.10, 0.11)
[info] Done updating.
[info] Formatting 8 Scala sources {file:/Users/geri/work/scalding/}scalding-thrift-macros(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeEnumOrderedBuf.scala: expected start of definition, but was Token(VAL,val,974,val)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOuterOrderedBuf.scala: expected start of definition, but was Token(VAL,val,1496,val)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala: expected start of definition, but was Token(VAL,val,1613,val)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeUnionOrderedBuf.scala: expected start of definition, but was Token(VAL,val,1159,val)
[info] Formatting 21 Scala sources {file:/Users/geri/work/scalding/}scalding-commons(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/HourlySources.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1390,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/LzoTraits.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1209,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/LongThriftTransformer.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,928,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/VersionedKeyValSource.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,2851,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/BinaryConverters.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1237,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/DailySources.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1247,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/FixedPathSources.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,735,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/LzoGenericSource.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1050,<)
[info] Formatting 13 Scala sources {file:/Users/geri/work/scalding/}scalding-parquet(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/thrift/ParquetThrift.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1095,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/thrift/PartitionedParquetThriftSource.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1128,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/TypedParquet.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,2448,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/thrift/Parquet346TBaseScheme.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1424,<)
[info] Generating scrooge thrift for /Users/geri/work/scalding/scalding-thrift-macros-fixtures/src/test/resources/test.thrift ...
[info] Generating scrooge thrift for /Users/geri/work/scalding/scalding-parquet-fixtures/src/test/resources/test.thrift ...
[info] Formatting 3 Scala sources {file:/Users/geri/work/scalding/}scalding-parquet-scrooge(compile) ...
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/scala/com/twitter/scalding/parquet/scrooge/Parquet346ScroogeScheme.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1337,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/scala/com/twitter/scalding/parquet/scrooge/ParquetScrooge.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,350,<)
[warn] Scalariform parser error for /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/scala/com/twitter/scalding/parquet/scrooge/PartitionedParquetScroogeSource.scala: Expected token RBRACKET but got Token(XML_START_OPEN,<,1179,<)
[info] Generating scrooge thrift for /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/src/test/resources/binary.thrift, /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/src/test/resources/compat.thrift, /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/src/test/resources/test.thrift ...
[info] Compiling 2 Scala sources to /Users/geri/work/scalding/scalding-args/target/scala-2.11/classes...
[info] Compiling 10 Java sources to /Users/geri/work/scalding/maple/target/classes...
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] /Users/geri/work/scalding/maple/src/main/java/com/twitter/maple/hbase/HBaseScheme.java: Some input files use or override a deprecated API.
[info] /Users/geri/work/scalding/maple/src/main/java/com/twitter/maple/hbase/HBaseScheme.java: Recompile with -Xlint:deprecation for details.
[info] /Users/geri/work/scalding/maple/src/main/java/com/twitter/maple/hbase/HBaseScheme.java: Some input files use unchecked or unsafe operations.
[info] /Users/geri/work/scalding/maple/src/main/java/com/twitter/maple/hbase/HBaseScheme.java: Recompile with -Xlint:unchecked for details.
[info] Compiling 35 Scala sources and 1 Java source to /Users/geri/work/scalding/scalding-serialization/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/Boxed.scala:836: [UseGetOrElseNotPatMatch] ... match { Some(x) => x; None => {
[warn] val r: (Any => com.twitter.scalding.serialization.Boxed[Any], Class[com.twitter.scalding.serialization.Boxed[Any]]) = Boxed.this.next[Any]();
[warn] Boxed.this.boxedCache.putIfAbsent(cls, r);
[warn] r
[warn] }} can be replaced with .getOrElse({
[warn] val r: (Any => com.twitter.scalding.serialization.Boxed[Any], Class[com.twitter.scalding.serialization.Boxed[Any]]) = Boxed.this.next[Any]();
[warn] Boxed.this.boxedCache.putIfAbsent(cls, r);
[warn] r
[warn] })
[warn] val untypedRes = Option(boxedCache.get(cls)) match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/JavaStreamEnrichments.scala:248: [IdenticalStatements] You're doing the exact same thing twice or more.
[warn] s.write(-1)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:44: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] (fieldType, accessorMethod.name.toTermName, b)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:70: [MergeMaps] Merge these two map operations.
[warn] q"""
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:79: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] .filter(m => m.name.toTermName.toString.startsWith("_"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:83: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] (fieldType, accessorMethod.name.toTermName, b)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:109: [MergeMaps] Merge these two map operations.
[warn] q"""
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/SealedTraitOrderedBuf.scala:51: [TypeToType] Using toList on something that is already of type List.
[warn] val subClasses: List[Type] = knownDirectSubclasses.map(_.asType.toType).toList
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/SealedTraitOrderedBuf.scala:55: [TypeToType] Using toList on something that is already of type List.
[warn] }.zipWithIndex.map{ case ((tpe, tbuf), idx) => (idx, tpe, tbuf) }.toList
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:27: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def normalizedDispatcher(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:28: method normalize in class TypeApi is deprecated: Use `dealias` or `etaExpand` instead
[warn] case tpe if !(tpe.normalize == tpe) => buildDispatcher(tpe.normalize)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:28: method normalize in class TypeApi is deprecated: Use `dealias` or `etaExpand` instead
[warn] case tpe if !(tpe.normalize == tpe) => buildDispatcher(tpe.normalize)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:31: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def scaldingBasicDispatchers(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:59: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def fallbackImplicitDispatcher(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:63: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] private def outerDispatcher(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:72: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] private def innerDispatcher(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:77: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[OrderedSerialization[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:21: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] sealed trait CompileTimeLengthTypes[C <: Context] {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:29: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(tree: c.Tree): FastLengthCalculation[c.type] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:36: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] trait FastLengthCalculation[C <: Context] extends CompileTimeLengthTypes[C]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:39: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(tree: c.Tree): MaybeLengthCalculation[c.type] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:46: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] trait MaybeLengthCalculation[C <: Context] extends CompileTimeLengthTypes[C]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:49: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(intArg: Int): ConstantLengthCalculation[c.type] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:60: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] trait ConstantLengthCalculation[C <: Context] extends CompileTimeLengthTypes[C] {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:65: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context): NoLengthCalculationAvailable[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/CompileTimeLengthTypes.scala:76: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] trait NoLengthCalculationAvailable[C <: Context] extends CompileTimeLengthTypes[C]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:24: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def compareBinary(c: Context)(inputStreamA: c.TermName, inputStreamB: c.TermName)(elementData: List[(c.universe.Type, c.universe.TermName, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:26: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:26: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:47: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def hash(c: Context)(element: c.TermName)(elementData: List[(c.universe.Type, c.universe.TermName, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:49: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:49: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:69: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def put(c: Context)(inputStream: c.TermName, element: c.TermName)(elementData: List[(c.universe.Type, c.universe.TermName, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:71: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:71: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:84: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def length(c: Context)(element: c.Tree)(elementData: List[(c.universe.Type, c.universe.TermName, TreeOrderedBuf[c.type])]): CompileTimeLengthTypes[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:135: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def compare(c: Context)(elementA: c.TermName, elementB: c.TermName)(elementData: List[(c.universe.Type, c.universe.TermName, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:138: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/ProductLike.scala:138: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:38: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def compareBinary(c: Context)(inputStreamA: c.TermName, inputStreamB: c.TermName)(subData: List[(Int, c.Type, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:40: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:40: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:84: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def hash(c: Context)(element: c.TermName)(subData: List[(Int, c.Type, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:86: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:86: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:119: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def put(c: Context)(inputStream: c.TermName, element: c.TermName)(subData: List[(Int, c.Type, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:121: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:121: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:153: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def length(c: Context)(element: c.Tree)(subData: List[(Int, c.Type, TreeOrderedBuf[c.type])]): CompileTimeLengthTypes[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:156: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:156: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:212: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def get(c: Context)(inputStream: c.TermName)(subData: List[(Int, c.Type, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:214: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:214: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:250: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def compare(c: Context)(cmpType: c.Type, elementA: c.TermName, elementB: c.TermName)(subData: List[(Int, c.Type, TreeOrderedBuf[c.type])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:253: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/SealedTraitLike.scala:253: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:71: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def toOrderedSerialization[T](c: Context)(t: TreeOrderedBuf[c.type])(implicit T: t.ctx.WeakTypeTag[T]): t.ctx.Expr[OrderedSerialization[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:73: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:73: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:216: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val termName = newTermName(n)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:261: method newTermName in trait Names is deprecated: Use TermName instead
[warn] ${t.hash(newTermName("passedInObjectToHash"))}
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:285: method newTermName in trait Names is deprecated: Use TermName instead
[warn] ${discardLength(newTermName("from"))}
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:286: method newTermName in trait Names is deprecated: Use TermName instead
[warn] _root_.scala.util.Success(${t.get(newTermName("from"))})
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:294: method newTermName in trait Names is deprecated: Use TermName instead
[warn] ${putFnGen(newTermName("into"), newTermName("e"))}
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:294: method newTermName in trait Names is deprecated: Use TermName instead
[warn] ${putFnGen(newTermName("into"), newTermName("e"))}
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:302: method newTermName in trait Names is deprecated: Use TermName instead
[warn] ${t.compare(newTermName("x"), newTermName("y"))}
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:302: method newTermName in trait Names is deprecated: Use TermName instead
[warn] ${t.compare(newTermName("x"), newTermName("y"))}
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/TreeOrderedBuf.scala:309: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] abstract class TreeOrderedBuf[C <: Context] {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ByteBufferOrderedBuf.scala:29: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ByteBufferOrderedBuf.scala:33: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ByteBufferOrderedBuf.scala:36: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ByteBufferOrderedBuf.scala:36: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:27: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:32: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]], outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:34: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:34: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:39: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseClassOrderedBuf.scala:72: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] ${outerType.typeSymbol.companionSymbol}(..${getValProcessor.map(_._2)})
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseObjectOrderedBuf.scala:27: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseObjectOrderedBuf.scala:32: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/CaseObjectOrderedBuf.scala:43: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] override def get(inputStream: ctx.TermName): ctx.Tree = q"${outerType.typeSymbol.companionSymbol}"
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/EitherOrderedBuf.scala:27: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/EitherOrderedBuf.scala:31: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]], outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/EitherOrderedBuf.scala:33: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/EitherOrderedBuf.scala:33: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ImplicitOrderedBuf.scala:31: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ImplicitOrderedBuf.scala:39: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ImplicitOrderedBuf.scala:41: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ImplicitOrderedBuf.scala:41: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ImplicitOrderedBuf.scala:45: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val variableName = newTermName(variableNameStr)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/OptionOrderedBuf.scala:27: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/OptionOrderedBuf.scala:31: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]], outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/OptionOrderedBuf.scala:33: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/OptionOrderedBuf.scala:33: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:28: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:63: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:68: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val javaType = newTermName(javaTypeStr)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:70: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:70: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:75: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val bbGetter = newTermName("read" + shortName)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:76: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val bbPutter = newTermName("write" + shortName)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:82: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val primitiveAccessor = newTermName(shortName.toLowerCase + "Value")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/PrimitiveOrderedBuf.scala:94: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val typeLowerCase = newTermName(javaTypeStr.toLowerCase)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:28: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:70: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]], originalType: c.Type, outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:72: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:72: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/ProductOrderedBuf.scala:77: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/SealedTraitOrderedBuf.scala:24: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/SealedTraitOrderedBuf.scala:28: method isAbstractClass in trait ClassSymbolApi is deprecated: Use isAbstract instead
[warn] case tpe if (tpe.typeSymbol.isClass && (tpe.typeSymbol.asClass.isAbstractClass || tpe.typeSymbol.asClass.isTrait)) => SealedTraitOrderedBuf(c)(buildDispatcher, tpe)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/SealedTraitOrderedBuf.scala:33: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]], outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/SealedTraitOrderedBuf.scala:35: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/SealedTraitOrderedBuf.scala:35: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/StringOrderedBuf.scala:28: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/StringOrderedBuf.scala:32: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/StringOrderedBuf.scala:35: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/StringOrderedBuf.scala:35: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/TraversablesOrderedBuf.scala:40: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/TraversablesOrderedBuf.scala:69: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/TraversablesOrderedBuf.scala:75: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/TraversablesOrderedBuf.scala:75: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/TraversablesOrderedBuf.scala:79: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] val companionSymbol = outerType.typeSymbol.companionSymbol
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/TraversablesOrderedBuf.scala:87: method apply in class TypeRefExtractor is deprecated: Use `internal.typeRef` instead
[warn] TypeRef.apply(containerType.pre, containerType.sym, List(tpe1, tpe2))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/UnitOrderedBuf.scala:28: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/UnitOrderedBuf.scala:32: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/OrderedBufferableProviderImpl.scala:27: [UnusedParameter] Parameter c is not used in method normalizedDispatcher.
[warn] def normalizedDispatcher(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/main/scala/com/twitter/scalding/serialization/macros/impl/ordered_serialization/providers/StableKnownDirectSubclasses.scala:16: [UnusedParameter] Parameter c is not used in method apply.
[warn] def apply(c: Context)(tpe: c.Type): List[c.universe.TypeSymbol] =
[warn] ^
[warn] 128 warnings found
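The long run of reflection deprecations above all have the same shape of fix: `newTermName(c.fresh(id))` becomes `TermName(c.freshName(id))`. A minimal sketch (my illustration, not scalding's actual patch) using the runtime universe, where `TermName` is also available; `freshName` itself only exists on a macro `Context`:

```scala
import scala.reflect.runtime.universe._

// Deprecated (2.11): def freshT(id: String) = newTermName(c.fresh(id))
// Replacement:       def freshT(id: String) = TermName(c.freshName(id))
// TermName is a plain constructor, demonstrable outside a macro context:
val tn: TermName = TermName("someVal")
assert(tn.toString == "someVal")
```

The same substitution covers the `stringToTermName` deprecation that shows up later in the log.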
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] Compiling 8 Scala sources to /Users/geri/work/scalding/scalding-date/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-date/src/main/scala/com/twitter/scalding/CalendarOps.scala:21: [IdenticalCaseBodies] Bodies of 6 neighbouring cases are identical and could be merged.
[warn] case Calendar.HOUR_OF_DAY => () // Skip
[warn] ^
[warn] one warning found
[info] Compiling 2 Scala sources to /Users/geri/work/scalding/scalding-args/target/scala-2.11/test-classes...
[info] Compiling 6 Scala sources to /Users/geri/work/scalding/scalding-serialization/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:364: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] primitiveOrderedBufferSupplier[TestSealedAbstractClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:365: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] check[TestSealedAbstractClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:366: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] checkMany[TestSealedAbstractClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:367: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] checkCollisions[TestSealedAbstractClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:605: [DuplicateIfBranches] If statement branches have the same structure.
[warn] val oser = primitiveOrderedBufferSupplier[Either[Int, Int]]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:607: [DuplicateIfBranches] If statement branches have the same structure.
[warn] check[Either[Int, Int]]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:608: [DuplicateIfBranches] If statement branches have the same structure.
[warn] checkCollisions[Either[Int, Int]]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:661: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] primitiveOrderedBufferSupplier[SealedTraitTest]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:662: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] check[SealedTraitTest]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:663: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] checkMany[SealedTraitTest]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:664: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] checkCollisions[SealedTraitTest]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:310: [UnusedParameter] Parameter fresh_element$macro$32 is not used in method noLengthWrite.
[warn] primitiveOrderedBufferSupplier[Unit]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:311: [UnusedParameter] Parameter fresh_element$macro$53 is not used in method noLengthWrite.
[warn] check[Unit]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:312: [UnusedParameter] Parameter fresh_element$macro$74 is not used in method noLengthWrite.
[warn] checkMany[Unit]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:669: [UnusedParameter] Parameter fresh_element$macro$9728 is not used in method noLengthWrite.
[warn] primitiveOrderedBufferSupplier[TestObjectE.type]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:670: [UnusedParameter] Parameter fresh_element$macro$9749 is not used in method noLengthWrite.
[warn] check[TestObjectE.type]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-serialization/src/test/scala/com/twitter/scalding/serialization/macros/MacroOrderingProperties.scala:671: [UnusedParameter] Parameter fresh_element$macro$9770 is not used in method noLengthWrite.
[warn] checkMany[TestObjectE.type]
[warn] ^
[warn] 17 warnings found
[info] Compiling 5 Scala sources to /Users/geri/work/scalding/scalding-date/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-date/src/test/scala/com/twitter/scalding/DateProperties.scala:122: [YodaConditions] Yoda conditions using you are.
[warn] (false == ex.contains(upper)) &&
[warn] ^
[warn] /Users/geri/work/scalding/scalding-date/src/test/scala/com/twitter/scalding/DateTest.scala:341: [UseCountNotFilterLength] Use globed.count(...) instead of globed.filter(...).size
[warn] assert(splits.map { path => globed.filter { globMatchesDate(_)(path) }.size }
[warn] ^
[warn] two warnings found
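Both lint findings above are mechanical rewrites; a small sketch with hypothetical stand-in values for `ex`, `upper`, and `globed`:

```scala
val ex = Set(1, 2, 3)
val upper = 4

// [YodaConditions]: false == x reads backwards; negate instead.
val yoda = false == ex.contains(upper)
val plain = !ex.contains(upper)
assert(yoda == plain)

// [UseCountNotFilterLength]: count avoids building the intermediate list.
val globed = List("2017-08-01", "2017-08-02", "2017-08-14")
assert(globed.filter(_.endsWith("-14")).size == globed.count(_.endsWith("-14")))
```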
[info] Compiling 148 Scala sources and 2 Java sources to /Users/geri/work/scalding/scalding-core/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/HfsConfPropertySetter.scala:57: @deprecated now takes two arguments; see the scaladoc.
[warn] @deprecated("Tap config is deprecated, use sourceConfig or sinkConfig directly. In cascading configs applied to sinks can leak to sources in the step writing to the sink.")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:37: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class FlatMapFunction[S, T](@transient fn: S => TraversableOnce[T], fields: Fields,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:55: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class MapFunction[S, T](@transient fn: S => T, fields: Fields,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:76: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class CleanupIdentityFunction(@transient fn: () => Unit)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:89: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class CollectFunction[S, T](@transient fn: PartialFunction[S, T], fields: Fields,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:137: no valid targets for annotation on value commutativeSemigroup - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient commutativeSemigroup: Semigroup[V],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:196: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient fn: TupleEntry => TraversableOnce[(K, V)],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:197: no valid targets for annotation on value commutativeSemigroup - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient commutativeSemigroup: Semigroup[V],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:388: no valid targets for annotation on value bf - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient bf: => C, // begin function returns a context
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:389: no valid targets for annotation on value ef - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient ef: C => Unit, // end function to clean up context object
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:407: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient fn: (C, S) => T, // function that takes a context and a tuple and generate a new tuple
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:427: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient fn: (C, S) => TraversableOnce[T], // function that takes a context and a tuple, returns TraversableOnce of T
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:441: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class FilterFunction[T](@transient fn: T => Boolean, conv: TupleConverter[T])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:452: no valid targets for annotation on value fn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class FoldAggregator[T, X](@transient fn: (X, T) => X, @transient init: X, fields: Fields,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:452: no valid targets for annotation on value init - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class FoldAggregator[T, X](@transient fn: (X, T) => X, @transient init: X, fields: Fields,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:478: no valid targets for annotation on value inputFsmf - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inputFsmf: T => X,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:479: no valid targets for annotation on value inputRfn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inputRfn: (X, X) => X,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:480: no valid targets for annotation on value inputMrfn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inputMrfn: X => U,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:578: no valid targets for annotation on value inputMrfn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inputMrfn: T => X,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:579: no valid targets for annotation on value inputRfn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inputRfn: (X, X) => X,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:613: no valid targets for annotation on value init - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient init: I,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:614: no valid targets for annotation on value inputIterfn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inputIterfn: (I, Iterator[T]) => TraversableOnce[X],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:632: no valid targets for annotation on value init - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient init: I,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:634: no valid targets for annotation on value inputIterfn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inputIterfn: (I, C, Iterator[T]) => TraversableOnce[X],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Operations.scala:670: no valid targets for annotation on value reduceFn - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient reduceFn: (K, Iterator[V]) => Iterator[U],
[warn] ^
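The `@transient` warnings above mean the annotation on a plain constructor parameter has no field to target and is silently dropped. The compiler's own suggestion is a meta-annotation; a sketch with a hypothetical `FlatMapFn` standing in for scalding's `FlatMapFunction` (this silences the warning by pinning the annotation to the parameter — whether the captured field should truly be transient is a separate design question):

```scala
import scala.annotation.meta.param

// @(transient @param) targets the constructor parameter explicitly
// instead of being discarded with a warning.
class FlatMapFn[S, T](@(transient @param) fn: S => TraversableOnce[T]) extends Serializable {
  def apply(s: S): List[T] = fn(s).toList
}

val f = new FlatMapFn((s: String) => s.toList)
assert(f.apply("ab") == List('a', 'b'))
```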
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/ReferencedClassFinder.scala:82: abstract type pattern reflect.runtime.universe.TypeRef is unchecked since it is eliminated by erasure
[warn] case TypeRef(_, _, args) =>
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/ReferencedClassFinder.scala:92: abstract type pattern reflect.runtime.universe.NullaryMethodType is unchecked since it is eliminated by erasure
[warn] case NullaryMethodType(resultType) => getClassesForType(mirror, resultType)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/serialization/KryoHadoop.scala:25: no valid targets for annotation on value config - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class KryoHadoop(@transient config: Config) extends KryoInstantiator {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/source/CodecSource.scala:49: no valid targets for annotation on value injection - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class CodecSource[T] private (val hdfsPaths: Seq[String], val maxFailures: Int = 0)(implicit @transient injection: Injection[T, Array[Byte]])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/BijectedSourceSink.scala:31: no valid targets for annotation on value transformer - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class BijectedSourceSink[T, U](parent: BijectedSourceSink.SourceSink[T])(implicit @transient transformer: ImplicitBijection[T, U]) extends TypedSource[U] with TypedSink[U] {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/CascadingBackend.scala:569: abstract type V1 in type pattern com.twitter.scalding.serialization.OrderedSerialization[V1] is unchecked since it is eliminated by erasure
[warn] case ordser: OrderedSerialization[V1] =>
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/CoGroupJoiner.scala:11: no valid targets for annotation on value inJoinFunction - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient inJoinFunction: (K, Iterator[Any], Seq[Iterable[Any]]) => Iterator[Any]) extends CJoiner {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/DistinctCoGroupJoiner.scala:8: no valid targets for annotation on value joinF - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] @transient joinF: (K, Iterator[Any], Seq[Iterable[Any]]) => Iterator[Any])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Config.scala:553: [TypeToType] Using toMap on something that is already of type Map.
[warn] val asMap = conf.toMap.toMap[K, V] // linter:ignore we are upcasting K, V
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Config.scala:562: [TypeToType] Using toMap on something that is already of type Map.
[warn] m ++ (conf.toMap.toMap[K, V]) // linter:ignore we are upcasting K, V
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Job.scala:197: [TypeToType] Using toMap on something that is already of type Map.
[warn] .toMap.toMap[AnyRef, AnyRef] // linter:ignore the second one is to lift from String -> AnyRef
[warn] ^
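The `[TypeToType]` findings above look redundant but are not quite: `Map` is invariant in its key type, so the second `toMap` with explicit type arguments re-wraps the map at a wider key/value type — which is why the code carries a `linter:ignore`. A sketch with a hypothetical config map:

```scala
// A Map[String, String] is not a Map[AnyRef, AnyRef] by subtyping
// (keys are invariant), but toMap[K, V] can lift it:
val conf: Map[String, String] = Map("mapred.job.name" -> "wc")
val lifted: Map[AnyRef, AnyRef] = conf.toMap[AnyRef, AnyRef]
assert(lifted("mapred.job.name") == "wc")
```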
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/Tool.scala:129: [UseOptionForeachNotPatMatch] ... match { Some(x) => start(nextj, cnt.+(1)); None => {} } can be replaced with .foreach(start(nextj, cnt.+(1)))
[warn] j.next match {
[warn] ^
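The `[UseOptionForeachNotPatMatch]` rewrite suggested above, sketched with a hypothetical `start` in place of the one in `Tool.scala`:

```scala
var ran = 0
def start(n: Int): Unit = { ran += n }
val next: Option[Int] = Some(3)

// Flagged form:
next match {
  case Some(n) => start(n)
  case None => ()
}

// Suggested form — same effect, no pattern match:
next.foreach(start)
assert(ran == 6)
```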
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/bdd/PipeOperationsConversions.scala:40: [TypeToType] Using toList on something that is already of type List.
[warn] def apply(pipes: List[RichPipe]): Pipe = op(pipes.map(_.pipe).toList)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:97: [UseOptionExistsNotPatMatch] ... match { Some(x) => false; None => isNumbered(t)} can be replaced with .exists(isNumbered(t))
[warn] optionInner(c)(tpe) match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:166: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] val fieldName = accessorMethod.name.toTermName.toString
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/mathematics/Matrix2.scala:345: [UseGetOrElseNotPatMatch] ... match { Some(x) => x; None => {
[warn] val result: com.twitter.scalding.TypedPipe[(R, C2, V)] = Product.this.computePipe(Product.this.computePipe$default$1);
[warn] m.put(this, result);
[warn] result
[warn] }} can be replaced with .getOrElse({
[warn] val result: com.twitter.scalding.TypedPipe[(R, C2, V)] = Product.this.computePipe(Product.this.computePipe$default$1);
[warn] m.put(this, result);
[warn] result
[warn] })
[warn] case Some(m) => m.get(this) match {
[warn] ^
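The flagged `Matrix2` pattern above is a memo-table lookup: match on `m.get(this)`, compute and `put` on `None`. The linter suggests `getOrElse`; on a `mutable.Map`, `getOrElseUpdate` expresses the whole idiom directly. A self-contained sketch (hypothetical `compute`, not scalding's):

```scala
import scala.collection.mutable

val memo = mutable.Map.empty[String, Int]
var computations = 0
def compute(k: String): Int = { computations += 1; k.length }

// Replaces: memo.get(k) match { case Some(r) => r
//                               case None => val r = compute(k); memo.put(k, r); r }
def lookup(k: String): Int = memo.getOrElseUpdate(k, compute(k))

assert(lookup("abc") == 3)
assert(lookup("abc") == 3)
assert(computations == 1)  // second call hit the memo table
```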
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/FlatMappedFn.scala:77: [UndesirableTypeInference] Inferred type Any => TraversableOnce[B1]. (This might not be what you've intended)
[warn] val next = loop(rest)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/FlatMappedFn.scala:80: [UndesirableTypeInference] Inferred type Any => TraversableOnce[B1]. (This might not be what you've intended)
[warn] val next = loop(rest)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/Grouped.scala:117: [UndesirableTypeInference] Inferred type Iterable[Any]. (This might not be what you've intended)
[warn] val smallerHead = rightSeq.head
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/CoGroupJoiner.scala:40: [UndesirableTypeInference] Inferred type Iterator[Any]. (This might not be what you've intended)
[warn] val leftMost = unbox(iters.head)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/memory_backend/MemoryBackend.scala:310: [UndesirableTypeInference] Inferred type com.twitter.scalding.typed.memory_backend.MemoryPlanner.Op[Any]. (This might not be what you've intended)
[warn] val (m1, op) = plan(m, left)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/memory_backend/MemoryBackend.scala:349: [UndesirableTypeInference] Inferred type com.twitter.scalding.typed.memory_backend.MemoryPlanner.Op[Any]. (This might not be what you've intended)
[warn] val (m1, op) = plan(m, prev)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/memory_backend/MemoryBackend.scala:373: [UndesirableTypeInference] Inferred type com.twitter.scalding.typed.memory_backend.MemoryPlanner.Op[Any]. (This might not be what you've intended)
[warn] val (m1, op) = plan(m, input)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/memory_backend/MemoryBackend.scala:548: [UndesirableTypeInference] Inferred type com.twitter.scalding.typed.memory_backend.MemoryPlanner.Op[Any]. (This might not be what you've intended)
[warn] val (nextM, op) = plan(oldState.memo, pipe)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/HfsConfPropertySetter.scala:69: method tapConfig in trait HfsConfPropertySetter is deprecated: Tap config is deprecated, use sourceConfig or sinkConfig directly. In cascading configs applied to sinks can leak to sources in the step writing to the sink.
[warn] (tapConfig, tapConfig)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/HfsConfPropertySetter.scala:69: method tapConfig in trait HfsConfPropertySetter is deprecated: Tap config is deprecated, use sourceConfig or sinkConfig directly. In cascading configs applied to sinks can leak to sources in the step writing to the sink.
[warn] (tapConfig, tapConfig)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/ReferencedClassFinder.scala:47: method stringToTermName in trait Names is deprecated: Use explicit `TermName(s)` instead
[warn] scalaSignature = scalaType.member(universe.stringToTermName(field.getName)).typeSignature
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TemplateSource.scala:53: class TemplateTap in package local is deprecated: see corresponding Javadoc for more information.
[warn] new LTemplateTap(localTap, template, pathFields)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TemplateSource.scala:57: class TemplateTap in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] new HTemplateTap(hfsTap, template, pathFields)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TemplateSource.scala:61: class TemplateTap in package hadoop is deprecated: see corresponding Javadoc for more information.
[warn] new HTemplateTap(hfsTap, template, pathFields)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TypedDelimited.scala:32: class FixedPathTypedDelimited in package scalding is deprecated: Use FixedTypedText instead
[warn] def apply[T: Manifest: TupleConverter: TupleSetter](path: String): FixedPathTypedDelimited[T] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TypedDelimited.scala:35: class FixedPathTypedDelimited in package scalding is deprecated: Use FixedTypedText instead
[warn] def apply[T: Manifest: TupleConverter: TupleSetter](paths: Seq[String]): FixedPathTypedDelimited[T] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TypedDelimited.scala:40: class FixedPathTypedDelimited in package scalding is deprecated: Use FixedTypedText instead
[warn] def apply[T: Manifest: TupleConverter: TupleSetter](path: String, f: Fields): FixedPathTypedDelimited[T] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TypedDelimited.scala:43: class FixedPathTypedDelimited in package scalding is deprecated: Use FixedTypedText instead
[warn] def apply[T: Manifest: TupleConverter: TupleSetter](paths: Seq[String], f: Fields): FixedPathTypedDelimited[T] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TypedDelimited.scala:44: class FixedPathTypedDelimited in package scalding is deprecated: Use FixedTypedText instead
[warn] new FixedPathTypedDelimited[T](paths, f, skipHeader, writeHeader, separator)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/filecache/DistributedCacheFile.scala:116: method addCacheFile in object DistributedCache is deprecated: see corresponding Javadoc for more information.
[warn] HDistributedCache.addCacheFile(symlinkedUriFor(hadoopFile.sourceUri), hadoopMode.jobConf)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassBasedSetterImpl.scala:31: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(container: c.TermName, allowUnknownTypes: Boolean,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassBasedSetterImpl.scala:57: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val someVal = newTermName(c.fresh("someVal"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassBasedSetterImpl.scala:57: method fresh in trait Names is deprecated: Use freshName instead
[warn] val someVal = newTermName(c.fresh("someVal"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassBasedSetterImpl.scala:72: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val cca = newTermName(c.fresh("access"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassBasedSetterImpl.scala:72: method fresh in trait Names is deprecated: Use freshName instead
[warn] val cca = newTermName(c.fresh("access"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassBasedSetterImpl.scala:83: method normalize in class TypeApi is deprecated: Use `dealias` or `etaExpand` instead
[warn] val norm = tpe.normalize
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassBasedSetterImpl.scala:114: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassFieldSetter.scala:30: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def absent(c: Context)(idx: Int, container: c.TermName): c.Tree
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassFieldSetter.scala:33: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def default(c: Context)(idx: Int, container: c.TermName, fieldValue: c.Tree): c.Tree
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/CaseClassFieldSetter.scala:37: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def from(c: Context)(fieldType: c.Type, idx: Int, container: c.TermName, fieldValue: c.Tree): Try[c.Tree]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:66: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def toFieldsImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[cascading.tuple.Fields] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:69: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def toFieldsWithUnknownImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[cascading.tuple.Fields] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:72: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def toFieldsWithUnknownNoPrefixImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[cascading.tuple.Fields] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:75: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def toIndexedFieldsImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[cascading.tuple.Fields] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:78: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def toIndexedFieldsWithUnknownImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[cascading.tuple.Fields] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:81: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def toFieldsCommonImpl[T](c: Context, namingScheme: NamingScheme, allowUnknownTypes: Boolean)(implicit T: c.WeakTypeTag[T]): c.Expr[cascading.tuple.Fields] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/FieldsProviderImpl.scala:163: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleConverterImpl.scala:31: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTupleConverterImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[TupleConverter[T]] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleConverterImpl.scala:34: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTupleConverterWithUnknownImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[TupleConverter[T]] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleConverterImpl.scala:37: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTupleConverterCommonImpl[T](c: Context, allowUnknownTypes: Boolean)(implicit T: c.WeakTypeTag[T]): c.Expr[TupleConverter[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleConverterImpl.scala:44: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleConverterImpl.scala:77: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] q"${tpe.typeSymbol.companionSymbol}(..$trees)"
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleFieldSetter.scala:27: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] override def absent(c: Context)(idx: Int, container: c.TermName): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleFieldSetter.scala:37: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] override def default(c: Context)(idx: Int, container: c.TermName, fieldValue: c.Tree): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleFieldSetter.scala:42: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] override def from(c: Context)(fieldType: c.Type, idx: Int, container: c.TermName, fieldValue: c.Tree): Try[c.Tree] = Try {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleSetterImpl.scala:30: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTupleSetterImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[TupleSetter[T]] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleSetterImpl.scala:33: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTupleSetterWithUnknownImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[TupleSetter[T]] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleSetterImpl.scala:36: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTupleSetterCommonImpl[T](c: Context, allowUnknownTypes: Boolean)(implicit T: c.WeakTypeTag[T]): c.Expr[TupleSetter[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleSetterImpl.scala:39: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val tupTerm = newTermName(c.fresh("tup"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TupleSetterImpl.scala:39: method fresh in trait Names is deprecated: Use freshName instead
[warn] val tupTerm = newTermName(c.fresh("tup"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala:31: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTypeDescriptorImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[TypeDescriptor[T]] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala:34: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTypeDescriptorWithUnknownImpl[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[TypeDescriptor[T]] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala:49: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def evidentColumn(c: Context, allowUnknown: Boolean = false)(tpe: c.universe.Type): Option[Int] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala:53: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] t.declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala:93: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def optionInner(c: Context)(opt: c.universe.Type): Option[c.universe.Type] =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala:98: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def isTuple[T](c: Context)(implicit T: c.WeakTypeTag[T]): Boolean = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala:125: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassTypeDescriptorCommonImpl[T](c: Context, allowUnknownTypes: Boolean)(implicit T: c.WeakTypeTag[T]): c.Expr[TypeDescriptor[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/source/DailySources.scala:51: trait TypedDelimited in package scalding is deprecated: Use TypedTextDelimited instead
[warn] extends DailySuffixSource(prefix, dateRange) with TypedDelimited[T]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/source/HourlySources.scala:41: trait TypedDelimited in package scalding is deprecated: Use TypedTextDelimited instead
[warn] extends HourlySuffixSource(prefix, dateRange) with TypedDelimited[T]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/TypedPipe.scala:420: [UnusedParameter] Parameter fresh_element$macro$208 is not used in method noLengthWrite.
[warn] def groupAll: Grouped[Unit, T] = groupBy(x => ())(ordSer[Unit]).withReducers(1)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/AsyncFlowDefRunner.scala:310: [UnusedParameter] Parameter pipe is not used in method forceToDisk.
[warn] private def forceToDisk[T](
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/AsyncFlowDefRunner.scala:256: The outer reference in this type test cannot be checked at run time.
[warn] case TypedPipe.IterablePipe(_) => ()
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/AsyncFlowDefRunner.scala:257: The outer reference in this type test cannot be checked at run time.
[warn] case TypedPipe.SourcePipe(src: Mappable[A]) => ()
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/AsyncFlowDefRunner.scala:299: The outer reference in this type test cannot be checked at run time.
[warn] case TypedPipe.IterablePipe(iter) => Future.successful(iter)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/typed/cascading_backend/AsyncFlowDefRunner.scala:300: The outer reference in this type test cannot be checked at run time.
[warn] case TypedPipe.SourcePipe(src: Mappable[T]) =>
[warn] ^
[warn] there were 9 feature warnings; re-run with -feature for details
[warn] 107 warnings found
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] /Users/geri/work/scalding/scalding-core/src/main/java/com/twitter/scalding/cascading_interop/FlowListenerPromise.java: /Users/geri/work/scalding/scalding-core/src/main/java/com/twitter/scalding/cascading_interop/FlowListenerPromise.java uses unchecked or unsafe operations.
[info] /Users/geri/work/scalding/scalding-core/src/main/java/com/twitter/scalding/cascading_interop/FlowListenerPromise.java: Recompile with -Xlint:unchecked for details.
[info] Because /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/TypeDescriptor.scala contains a macro definition, the following dependencies are invalidated unconditionally:
[info] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/bdd/TBddDsl.scala
[info] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/MacroImplicits.scala
[info] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/Macros.scala
[info] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/macros/impl/TypeDescriptorProviderImpl.scala
[info] /Users/geri/work/scalding/scalding-core/src/main/scala/com/twitter/scalding/source/TypedText.scala
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] ArgTest:
[info] Tool.parseArgs
[info] - should handle the empty list
[info] - should accept any number of dashed args
[info] - should remove empty args in lists
[info] - should put initial args into the empty key
[info] - should allow any number of args per key
[info] - should allow any number of dashes
[info] - should round trip to/from string
[info] - should handle positional arguments
[info] - should handle negative numbers in args
[info] - should handle strange characters in the args
[info] - should access positional arguments using apply
[info] - should verify that args belong to an accepted key set
[info] - should correctly parse numeric args
[info] RangeSpecs:
[info] A Range
[info] - should contain its endpoints
[info] - should throw errors for misordered ranges
[info] - should assert lower bounds
[info] - should assert upper bounds
[info] should print nicely with mkString
[info] - should for trivial ranges
[info] - should for proper ranges
[info] Compiling 69 Scala sources to /Users/geri/work/scalding/scalding-core/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/CascadeTest.scala:86: [CloseSourceFile] You should close the file stream after use. (Streams get garbage collected, but it is possible to open too many at once)
[warn] val lines = fromFile(output1.getAbsolutePath).getLines.toList
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/ExecutionTest.scala:360: [UseHeadNotApply] It is idiomatic to use files.head instead of files(0) for List
[warn] assert(files(0).contains(tempFile))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/ExecutionTest.scala:388: [UseHeadNotApply] It is idiomatic to use files.head instead of files(0) for List
[warn] assert(files(0).contains(tempFileOne) || files(0).contains(tempFileTwo))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/ExecutionTest.scala:388: [UseHeadNotApply] It is idiomatic to use files.head instead of files(0) for List
[warn] assert(files(0).contains(tempFileOne) || files(0).contains(tempFileTwo))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/CoreTest.scala:1082: method apply in object JobTest is deprecated: Use the non-reflection based JobTest apply methods
[warn] JobTest("com.twitter.scalding.PivotJob")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/CumulativeSumTest.scala:64: method apply in object JobTest is deprecated: Use the non-reflection based JobTest apply methods
[warn] JobTest("com.twitter.scalding.AddRankingWithCumulativeSum")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/CumulativeSumTest.scala:79: method apply in object JobTest is deprecated: Use the non-reflection based JobTest apply methods
[warn] JobTest("com.twitter.scalding.AddRankingWithPartitionedCumulativeSum")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/PackTest.scala:164: method apply in object JobTest is deprecated: Use the non-reflection based JobTest apply methods
[warn] JobTest("com.twitter.scalding.ContainerToPopulationJob")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/SideEffectTest.scala:103: method apply in object JobTest is deprecated: Use the non-reflection based JobTest apply methods
[warn] JobTest("com.twitter.scalding.ZipBuffer")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/TypedPipeTest.scala:543: method apply in object JobTest is deprecated: Use the non-reflection based JobTest apply methods
[warn] JobTest(jobName)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-core/src/test/scala/com/twitter/scalding/mathematics/MatrixTest.scala:495: method apply in object JobTest is deprecated: Use the non-reflection based JobTest apply methods
[warn] JobTest("com.twitter.scalding.mathematics.MatrixSum3")
[warn] ^
[warn] there were 9 feature warnings; re-run with -feature for details
[warn] 12 warnings found
[info] Compiling 2 Scala sources to /Users/geri/work/scalding/scalding-json/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-json/src/main/scala/com/twitter/scalding/JsonLine.scala:59: non-variable type argument String in type pattern scala.collection.immutable.Map[String,AnyRef] (the underlying of Map[String,AnyRef]) is unchecked since it is eliminated by erasure
[warn] case fs: Map[String, AnyRef] => nestedRetrieval(Option(fs), tail)
[warn] ^
[warn] one warning found
[info] Compiling 5 Scala sources to /Users/geri/work/scalding/scalding-jdbc/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-jdbc/src/main/scala/com/twitter/scalding/jdbc/DriverColumnDefiner.scala:37: [TypeToType] Using toString on something that is already of type String.
[warn] val defStr = defOp.map { " DEFAULT '" + _.toString + "' " }.getOrElse(" ")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-jdbc/src/main/scala/com/twitter/scalding/jdbc/JDBCDriver.scala:13: constructor TableDesc in class TableDesc is deprecated: see corresponding Javadoc for more information.
[warn] new TableDesc(tableName.get, columnNames.map(_.get), columnDefinitions.map(_.get), null, null)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-jdbc/src/main/scala/com/twitter/scalding/jdbc/JDBCDriver.scala:36: constructor TableDesc in class TableDesc is deprecated: see corresponding Javadoc for more information.
[warn] new TableDesc(
[warn] ^
[warn] /Users/geri/work/scalding/scalding-jdbc/src/main/scala/com/twitter/scalding/jdbc/JDBCSource.scala:86: method setConcurrentReads in class JDBCTap is deprecated: see corresponding Javadoc for more information.
[warn] tap.setConcurrentReads(maxConcurrentReads)
[warn] ^
[warn] four warnings found
[info] Compiling 18 Scala sources to /Users/geri/work/scalding/scalding-db/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:121: [MergeMaps] Merge these two map operations.
[warn] .map {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:110: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] val fieldName = m.name.toTermName.toString.trim
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/DateTypeHandler.scala:31: [PassPartialFunctionDirectly] You can pass the partial function in directly. (Remove `t => t match {`).
[warn] extracted.flatMap { t =>
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/NumericTypeHandler.scala:32: [PassPartialFunctionDirectly] You can pass the partial function in directly. (Remove `t => t match {`).
[warn] extracted.flatMap { t =>
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/StringTypeHandler.scala:32: [PassPartialFunctionDirectly] You can pass the partial function in directly. (Remove `t => t match {`).
[warn] extracted.flatMap { t =>
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:22: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] private[this] def getDefaultArgs(c: Context)(tpe: c.Type): Map[String, c.Expr[String]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:25: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] val moduleSym = classSym.companionSymbol
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:31: method declaration in class TypeApi is deprecated: Use `decl` instead
[warn] val applyList = moduleSym.typeSignature.declaration(newTermName("apply")).asTerm.alternatives
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:31: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val applyList = moduleSym.typeSignature.declaration(newTermName("apply")).asTerm.alternatives
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:35: method paramss in trait MethodSymbolApi is deprecated: Use `paramLists` instead
[warn] apply.paramss.head.map(_.asTerm).zipWithIndex.flatMap{
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:39: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val getterName = newTermName("apply$default$" + (i + 1))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:45: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] private[scalding] def getColumnFormats[T](c: Context)(implicit T: c.WeakTypeTag[T]): List[ColumnFormat[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:89: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] outerTpe.declarations.foreach(_.typeSignature)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:94: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:96: method tpe in trait AnnotationApi is deprecated: Use `tree.tpe` instead
[warn] val mappedAnnotations = m.annotations.map(t => (t.tpe, t.scalaArgs))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:96: method scalaArgs in trait AnnotationApi is deprecated: Use `tree.children.tail` instead
[warn] val mappedAnnotations = m.annotations.map(t => (t.tpe, t.scalaArgs))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:107: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:159: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def getColumnDefn[T](c: Context)(implicit T: c.WeakTypeTag[T]): List[c.Expr[ColumnDefinition]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:170: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val fieldTypeSelect = Select(q"_root_.com.twitter.scalding.db", newTermName(cf.fieldType))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:182: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def getExtractor[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[ResultSetExtractor[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:187: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val rsmdTerm = newTermName(c.fresh("rsmd"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:187: method fresh in trait Names is deprecated: Use freshName instead
[warn] val rsmdTerm = newTermName(c.fresh("rsmd"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:194: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val typeNameTerm = newTermName(c.fresh(s"colTypeName_$pos"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:194: method fresh in trait Names is deprecated: Use freshName instead
[warn] val typeNameTerm = newTermName(c.fresh(s"colTypeName_$pos"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:225: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val nullableTerm = newTermName(c.fresh(s"isNullable_$pos"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:225: method fresh in trait Names is deprecated: Use freshName instead
[warn] val nullableTerm = newTermName(c.fresh(s"isNullable_$pos"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:240: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val rsTerm = newTermName(c.fresh("rs"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:240: method fresh in trait Names is deprecated: Use freshName instead
[warn] val rsTerm = newTermName(c.fresh("rs"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:264: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val valueTerm = newTermName(c.fresh("colValue"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:264: method fresh in trait Names is deprecated: Use freshName instead
[warn] val valueTerm = newTermName(c.fresh("colValue"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:273: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val tcTerm = newTermName(c.fresh("conv"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:273: method fresh in trait Names is deprecated: Use freshName instead
[warn] val tcTerm = newTermName(c.fresh("conv"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala:285: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[ColumnDefinitionProvider[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/DBTypeDescriptorImpl.scala:14: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[DBTypeDescriptor[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/JdbcFieldSetter.scala:29: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] override def absent(c: Context)(idx: Int, container: c.TermName): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/JdbcFieldSetter.scala:34: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] override def default(c: Context)(idx: Int, container: c.TermName, fieldValue: c.Tree): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/JdbcFieldSetter.scala:39: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] override def from(c: Context)(fieldType: c.Type, idx: Int, container: c.TermName, fieldValue: c.Tree): Try[c.Tree] = Try {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/JdbcStatementSetterImpl.scala:32: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def caseClassJdbcSetterCommonImpl[T](c: Context,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/JdbcStatementSetterImpl.scala:36: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val stmtTerm = newTermName(c.fresh("stmt"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/JdbcStatementSetterImpl.scala:36: method fresh in trait Names is deprecated: Use freshName instead
[warn] val stmtTerm = newTermName(c.fresh("stmt"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/AnnotationHelper.scala:28: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] val ctx: Context
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/ColumnFormat.scala:9: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(fAccessor: List[c.universe.MethodSymbol], fType: String, size: Option[Int])(implicit fName: FieldName,
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/ColumnFormat.scala:29: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] abstract class ColumnFormat[C <: Context](val ctx: C) {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/DateTypeHandler.scala:13: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(implicit accessorTree: List[c.universe.MethodSymbol],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/NumericTypeHandler.scala:13: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(implicit accessorTree: List[c.universe.MethodSymbol],
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/StringTypeHandler.scala:12: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(implicit accessorTree: List[c.universe.MethodSymbol],
[warn] ^
[warn] 46 warnings found
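(Annotation, not part of the build output: the `newTermName` / `fresh` deprecations above come from the Scala 2.11 reflection-API rename. A minimal sketch of the replacements, assuming `scala-reflect` is on the classpath; `TermName` and `freshName` are the documented successors:)

```scala
import scala.reflect.runtime.universe._

// Scala 2.10 style (deprecated in 2.11, as the warnings report):
//   val tcTerm = newTermName(c.fresh("conv"))
// Scala 2.11 style inside a macro context would be:
//   val tcTerm = TermName(c.freshName("conv"))
// Outside a macro, TermName can be constructed directly:
val tcTerm = TermName("conv")
println(tcTerm) // the name prints as its decoded string
```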
[info] Because /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/DBMacro.scala contains a macro definition, the following dependencies are invalidated unconditionally:
[info] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/ColumnDefinitionProviderImpl.scala
[info] /Users/geri/work/scalding/scalding-db/src/main/scala/com/twitter/scalding/db/macros/impl/handler/AnnotationHelper.scala
[info] Compiling 5 Scala sources to /Users/geri/work/scalding/scalding-repl/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-repl/src/main/scala/com/twitter/scalding/ScaldingShell.scala:140: method virtualDirectory in class IMain is deprecated: Use replOutput.dir instead
[warn] val virtualDirectory = repl.virtualDirectory
[warn] ^
[warn] one warning found
[info] Compiling 3 Scala sources to /Users/geri/work/scalding/scalding-hraven/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-hraven/src/main/scala/com/twitter/scalding/hraven/estimation/HRavenHistoryService.scala:200: [MergeMaps] Merge these two map operations.
[warn] step <- history
[warn] ^
[warn] one warning found
[info] Compiling 3 Scala sources to /Users/geri/work/scalding/scalding-avro/target/scala-2.11/classes...
[info] Compiling 7 Scala sources to /Users/geri/work/scalding/scalding-hadoop-test/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/HadoopPlatformExecutionTest.scala:48: a pure expression does nothing in statement position; you may be omitting necessary parentheses
[warn] case Success(s) => s
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/HadoopPlatformJobTest.scala:66: non-variable type argument org.apache.hadoop.mapred.JobConf in type pattern cascading.flow.Flow[org.apache.hadoop.mapred.JobConf] is unchecked since it is eliminated by erasure
[warn] case f: Flow[JobConf] => checker(f)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/HadoopPlatformExecutionTest.scala:36: [UndesirableTypeInference] Inferred type com.twitter.scalding.Execution[Any]. (This might not be what you've intended)
[warn] val execution = init(cons)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/HadoopPlatformJobTest.scala:77: [UseOptionForeachNotPatMatch] ... match { Some(x) => HadoopPlatformJobTest.this.execute(nextJob); None => {} } can be replaced with .foreach(HadoopPlatformJobTest.this.execute(nextJob))
[warn] job.next match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/MakeJar.scala:69: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => if (parent.==(src))
[warn] result.foldLeft[Option[java.io.File]]((scala.None: Option[java.io.File]))(((cum: Option[java.io.File], part: String) => scala.Some.apply[java.io.File](cum match {
[warn] case (x: java.io.File)Some[java.io.File]((p @ _)) => new java.io.File(p, part)
[warn] case scala.None => new java.io.File(part)
[warn] })))
[warn] else
[warn] MakeJar.this.getRelativeFileBetween(parent, src.getParentFile(), {
[warn] <synthetic> <artifact> val x$2: String = src.getName();
[warn] result.::[String](x$2)
[warn] }); None => None} can be replaced with .flatMap(if (parent.==(src))
[warn] result.foldLeft[Option[java.io.File]]((scala.None: Option[java.io.File]))(((cum: Option[java.io.File], part: String) => scala.Some.apply[java.io.File](cum match {
[warn] case (x: java.io.File)Some[java.io.File]((p @ _)) => new java.io.File(p, part)
[warn] case scala.None => new java.io.File(part)
[warn] })))
[warn] else
[warn] MakeJar.this.getRelativeFileBetween(parent, src.getParentFile(), {
[warn] <synthetic> <artifact> val x$2: String = src.getName();
[warn] result.::[String](x$2)
[warn] }))
[warn] Option(source) match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/LocalCluster.scala:46: class MiniMRCluster in package mapred is deprecated: see corresponding Javadoc for more information.
[warn] private var hadoop: Option[(MiniDFSCluster, MiniMRCluster, JobConf)] = None
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/LocalCluster.scala:90: constructor MiniDFSCluster in class MiniDFSCluster is deprecated: see corresponding Javadoc for more information.
[warn] val dfs = new MiniDFSCluster(conf, 4, true, null)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/main/scala/com/twitter/scalding/platform/LocalCluster.scala:92: class MiniMRCluster in package mapred is deprecated: see corresponding Javadoc for more information.
[warn] val cluster = new MiniMRCluster(4, fileSystem.getUri.toString, 1, null, null, new JobConf(conf))
[warn] ^
[warn] 8 warnings found
[info] Compiling 1 Scala source to /Users/geri/work/scalding/tutorial/execution-tutorial/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/tutorial/execution-tutorial/ExecutionTutorial.scala:51: [PassPartialFunctionDirectly] You can pass the partial function in directly. (Remove `t => t match {`).
[warn] .onComplete { t => t match {
[warn] ^
[warn] one warning found
[info] Compiling 8 Scala sources to /Users/geri/work/scalding/scalding-thrift-macros/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:64: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] .filter(m => fieldNames.contains(m.name.toTermName.toString.toLowerCase))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:68: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] (fieldType, accessorMethod.name.toTermName, b)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:94: [MergeMaps] Merge these two map operations.
[warn] q"""
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeUnionOrderedBuf.scala:51: [TypeToType] Using toList on something that is already of type List.
[warn] }.zipWithIndex.map{ case ((tpe, tbuf), idx) => (idx, tpe, tbuf) }.toList
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeUnionOrderedBuf.scala:53: [InefficientUseOfListSize] Use subData.nonEmpty instead of comparing to subData.size. (subData is a List, size takes O(n) time)
[warn] require(subData.size > 0, "Must have some sub types on a union?")
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ScroogeInternalOrderedSerializationImpl.scala:37: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] private def baseScroogeDispatcher(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ScroogeInternalOrderedSerializationImpl.scala:51: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] private def innerDispatcher(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ScroogeInternalOrderedSerializationImpl.scala:63: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] private def outerDispatcher(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ScroogeInternalOrderedSerializationImpl.scala:74: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply[T](c: Context)(implicit T: c.WeakTypeTag[T]): c.Expr[OrderedSerialization[T]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeEnumOrderedBuf.scala:25: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeEnumOrderedBuf.scala:34: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeEnumOrderedBuf.scala:37: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeEnumOrderedBuf.scala:37: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"fresh_$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeEnumOrderedBuf.scala:53: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] q"${outerType.typeSymbol.companionSymbol}.apply($inputStream.readPosVarInt)"
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:32: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:41: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]], outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:43: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:43: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:47: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] val companionSymbol = outerType.typeSymbol.companionSymbol
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:50: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:51: method decoded in class NameApi is deprecated: Use `decodedName.toString` instead
[warn] .filter(_.name.decoded.endsWith("Field "))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:56: method decoded in class NameApi is deprecated: Use `decodedName.toString` instead
[warn] val decodedName = t.name.decoded // Looks like "MethodNameField "
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOrderedBuf.scala:62: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOuterOrderedBuf.scala:32: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOuterOrderedBuf.scala:41: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOuterOrderedBuf.scala:43: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOuterOrderedBuf.scala:43: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeOuterOrderedBuf.scala:47: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val variableName = newTermName(variableNameStr)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeUnionOrderedBuf.scala:26: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def dispatch(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]]): PartialFunction[c.Type, TreeOrderedBuf[c.type]] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeUnionOrderedBuf.scala:37: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def apply(c: Context)(buildDispatcher: => PartialFunction[c.Type, TreeOrderedBuf[c.type]], outerType: c.Type): TreeOrderedBuf[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeUnionOrderedBuf.scala:39: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/ScroogeUnionOrderedBuf.scala:39: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(s"$id"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:29: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def compareBinary(c: Context)(inputStreamA: c.TermName, inputStreamB: c.TermName)(subData: List[(Int, c.Type, Option[TreeOrderedBuf[c.type]])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:31: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:31: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:76: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def hash(c: Context)(element: c.TermName)(subData: List[(Int, c.Type, Option[TreeOrderedBuf[c.type]])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:78: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:78: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:114: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def put(c: Context)(inputStream: c.TermName, element: c.TermName)(subData: List[(Int, c.Type, Option[TreeOrderedBuf[c.type]])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:116: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:116: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:150: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def length(c: Context)(element: c.Tree)(subData: List[(Int, c.Type, Option[TreeOrderedBuf[c.type]])]): CompileTimeLengthTypes[c.type] = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:153: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:153: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:209: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def get(c: Context)(inputStream: c.TermName)(subData: List[(Int, c.Type, Option[TreeOrderedBuf[c.type]])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:211: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:211: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:253: type Context in package macros is deprecated: Use blackbox.Context or whitebox.Context instead
[warn] def compare(c: Context)(cmpType: c.Type, elementA: c.TermName, elementB: c.TermName)(subData: List[(Int, c.Type, Option[TreeOrderedBuf[c.type]])]): c.Tree = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:256: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/main/scala/com/twitter/scalding/thrift/macros/impl/ordered_serialization/UnionLike.scala:256: method fresh in trait Names is deprecated: Use freshName instead
[warn] def freshT(id: String) = newTermName(c.fresh(id))
[warn] ^
[warn] 50 warnings found
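(Annotation, not part of the build output: the `companionSymbol`, `declarations`, and `decoded` deprecations in the block above have direct 2.11 replacements — `companion`, `decls`, and `decodedName.toString`. A hedged sketch using the runtime universe, with a hypothetical `Point` case class for illustration:)

```scala
import scala.reflect.runtime.universe._

case class Point(x: Int, y: Int) // illustrative type, not from the build

val tpe = typeOf[Point]
// was: tpe.typeSymbol.companionSymbol (deprecated in 2.11)
val comp = tpe.typeSymbol.companion
// was: tpe.declarations (deprecated in 2.11)
val members = tpe.decls
// was: sym.name.decoded (deprecated in 2.11)
val names = members.map(_.name.decodedName.toString)
```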
[info] Compiling 21 Scala sources and 4 Java sources to /Users/geri/work/scalding/scalding-commons/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/DailySources.scala:32: no valid targets for annotation on value suppliedInjection - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] abstract class DailySuffixLzoCodec[T](prefix: String, dateRange: DateRange)(implicit @transient suppliedInjection: Injection[T, Array[Byte]])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/HourlySources.scala:27: no valid targets for annotation on value suppliedInjection - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] abstract class HourlySuffixLzoCodec[T](prefix: String, dateRange: DateRange)(implicit @transient suppliedInjection: Injection[T, Array[Byte]])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/LzoGenericScheme.scala:118: no valid targets for annotation on value conv - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] class LzoGenericScheme[M](@transient conv: BinaryConverter[M], clazz: Class[M]) extends LzoBinaryScheme[M, GenericWritable[M]] {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/VersionedKeyValSource.scala:57: no valid targets for annotation on value codec - it is discarded unused. You may specify targets with meta-annotations, e.g. @(transient @param)
[warn] implicit @transient codec: Injection[(K, V), (Array[Byte], Array[Byte])]) extends Source
[warn] ^
[warn] /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/LzoGenericScheme.scala:82: [UndesirableTypeInference] Inferred type com.twitter.chill.Externalizer[Nothing]. (This might not be what you've intended)
[warn] val extern = ExternalizerSerializer.inj.invert(data).get
[warn] ^
[warn] /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/VersionedKeyValSource.scala:130: [UseOptionExistsNotPatMatch] ... match { Some(x) => false; None => mode match {
[warn] case (buffers: com.twitter.scalding.Source => Option[scala.collection.mutable.Buffer[cascading.tuple.Tuple]])com.twitter.scalding.Test((buffers @ _)) => buffers.apply(this).map[Boolean](((x$3: scala.collection.mutable.Buffer[cascading.tuple.Tuple]) => x$3.isEmpty.unary_!)).getOrElse[Boolean](false)
[warn] case (conf: org.apache.hadoop.conf.Configuration, buffers: com.twitter.scalding.Source => Option[scala.collection.mutable.Buffer[cascading.tuple.Tuple]])com.twitter.scalding.HadoopTest((conf @ _), (buffers @ _)) => buffers.apply(this).map[Boolean](((x$4: scala.collection.mutable.Buffer[cascading.tuple.Tuple]) => x$4.isEmpty.unary_!)).getOrElse[Boolean](false)
[warn] case (m @ (_: com.twitter.scalding.HadoopMode)) => {
[warn] val conf: org.apache.hadoop.mapred.JobConf = new org.apache.hadoop.mapred.JobConf(m.jobConf);
[warn] val store: com.twitter.scalding.commons.datastores.VersionedStore = VersionedKeyValSource.this.sink.getStore(conf);
[warn] store.hasVersion(version)
[warn] }
[warn] case _ => scala.sys.`package`.error(scala.StringContext.apply("Unknown mode ", "").s(mode))
[warn] }} can be replaced with .exists(mode match {
[warn] case (buffers: com.twitter.scalding.Source => Option[scala.collection.mutable.Buffer[cascading.tuple.Tuple]])com.twitter.scalding.Test((buffers @ _)) => buffers.apply(this).map[Boolean](((x$3: scala.collection.mutable.Buffer[cascading.tuple.Tuple]) => x$3.isEmpty.unary_!)).getOrElse[Boolean](false)
[warn] case (conf: org.apache.hadoop.conf.Configuration, buffers: com.twitter.scalding.Source => Option[scala.collection.mutable.Buffer[cascading.tuple.Tuple]])com.twitter.scalding.HadoopTest((conf @ _), (buffers @ _)) => buffers.apply(this).map[Boolean](((x$4: scala.collection.mutable.Buffer[cascading.tuple.Tuple]) => x$4.isEmpty.unary_!)).getOrElse[Boolean](false)
[warn] case (m @ (_: com.twitter.scalding.HadoopMode)) => {
[warn] val conf: org.apache.hadoop.mapred.JobConf = new org.apache.hadoop.mapred.JobConf(m.jobConf);
[warn] val store: com.twitter.scalding.commons.datastores.VersionedStore = VersionedKeyValSource.this.sink.getStore(conf);
[warn] store.hasVersion(version)
[warn] }
[warn] case _ => scala.sys.`package`.error(scala.StringContext.apply("Unknown mode ", "").s(mode))
[warn] })
[warn] sinkVersion match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-commons/src/main/scala/com/twitter/scalding/commons/source/DailySources.scala:45: method erasure in trait ClassManifestDeprecatedApis is deprecated: Use runtimeClass instead
[warn] override def column = manifest[T].erasure
[warn] ^
[warn] 7 warnings found
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] /Users/geri/work/scalding/scalding-commons/src/main/java/com/twitter/scalding/commons/datastores/VersionedStore.java: /Users/geri/work/scalding/scalding-commons/src/main/java/com/twitter/scalding/commons/datastores/VersionedStore.java uses or overrides a deprecated API.
[info] /Users/geri/work/scalding/scalding-commons/src/main/java/com/twitter/scalding/commons/datastores/VersionedStore.java: Recompile with -Xlint:deprecation for details.
[info] /Users/geri/work/scalding/scalding-commons/src/main/java/com/twitter/scalding/commons/scheme/KeyValueByteScheme.java: /Users/geri/work/scalding/scalding-commons/src/main/java/com/twitter/scalding/commons/scheme/KeyValueByteScheme.java uses unchecked or unsafe operations.
[info] /Users/geri/work/scalding/scalding-commons/src/main/java/com/twitter/scalding/commons/scheme/KeyValueByteScheme.java: Recompile with -Xlint:unchecked for details.
[info] Compiling 13 Scala sources and 8 Java sources to /Users/geri/work/scalding/scalding-parquet/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetSchemaProvider.scala:63: [TypeToType] Using toTermName on something that is already of type TermName.
[warn] val fieldName = accessorMethod.name.toTermName.toString
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/scheme/TypedParquetTupleScheme.scala:61: [UndesirableTypeInference] Inferred type scala.util.Try[Any]. (This might not be what you've intended)
[warn] val readSupportInstance = ParquetInputOutputFormat.injection.invert(readSupport)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/scheme/TypedParquetTupleScheme.scala:114: [UndesirableTypeInference] Inferred type scala.util.Try[Any]. (This might not be what you've intended)
[warn] val writeSupportInstance = ParquetInputOutputFormat.injection.invert(writeSupport)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:109: [EmptyStringInterpolator] This string interpolation has no arguments.
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:116: [EmptyStringInterpolator] This string interpolation has no arguments.
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:122: [EmptyStringInterpolator] This string interpolation has no arguments.
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/WriteSupportProvider.scala:57: [EmptyStringInterpolator] This string interpolation has no arguments.
[warn] val cacheName = newTermName(ctx.fresh(s"optionIndex"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/HasColumnProjection.scala:42: method withColumns in trait HasColumnProjection is deprecated: Use withColumnProjections, which uses a different glob syntax
[warn] val deprecated = withColumns
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/thrift/Parquet346TBaseScheme.scala:44: method setRecordConverterClass in object ThriftReadSupport is deprecated: see corresponding Javadoc for more information.
[warn] ThriftReadSupport.setRecordConverterClass(jobConf, classOf[Parquet346TBaseRecordConverter[_]])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:109: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:109: method fresh in trait Names is deprecated: Use freshName instead
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:116: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:116: method fresh in trait Names is deprecated: Use freshName instead
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:122: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:122: method fresh in trait Names is deprecated: Use freshName instead
[warn] val converterName = newTermName(ctx.fresh(s"fieldConverter"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:170: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetReadSupportProvider.scala:190: method companionSymbol in trait SymbolApi is deprecated: Use `companion` instead, but beware of possible changes in behavior
[warn] val companion = tpe.typeSymbol.companionSymbol
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/ParquetSchemaProvider.scala:60: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/WriteSupportProvider.scala:57: method newTermName in trait Names is deprecated: Use TermName instead
[warn] val cacheName = newTermName(ctx.fresh(s"optionIndex"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/WriteSupportProvider.scala:57: method fresh in trait Names is deprecated: Use freshName instead
[warn] val cacheName = newTermName(ctx.fresh(s"optionIndex"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/WriteSupportProvider.scala:105: method declarations in class TypeApi is deprecated: Use `decls` instead
[warn] .declarations
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/WriteSupportProvider.scala:117: method newTermName in trait Names is deprecated: Use TermName instead
[warn] def createGroupName(): TermName = newTermName(ctx.fresh("group"))
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/main/scala/com/twitter/scalding/parquet/tuple/macros/impl/WriteSupportProvider.scala:117: method fresh in trait Names is deprecated: Use freshName instead
[warn] def createGroupName(): TermName = newTermName(ctx.fresh("group"))
[warn] ^
[warn] 23 warnings found
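The deprecation warnings above all point at the same Scala 2.11 macro-API migration, and the compiler names the replacements itself: `TermName` for `newTermName`, `freshName` for `fresh`, `decls` for `declarations`, `companion` for `companionSymbol`, plus dropping the needless `s` prefix on interpolation-free strings. A minimal sketch of the suggested rewrite (`ctx` here is a hypothetical macro `Context`, not code from this repo):

```scala
import scala.reflect.macros.blackbox.Context

object MacroNameHelpers {
  // was: newTermName(ctx.fresh(s"fieldConverter"))
  // now: non-deprecated constructors, plain string literal (no empty interpolator)
  def fieldConverterName(ctx: Context): ctx.universe.TermName = {
    import ctx.universe._
    TermName(ctx.freshName("fieldConverter"))
  }

  // was: tpe.declarations / tpe.typeSymbol.companionSymbol
  // now: tpe.decls / tpe.typeSymbol.companion
  def caseAccessors(ctx: Context)(tpe: ctx.universe.Type) = {
    import ctx.universe._
    tpe.decls.collect {
      case m: MethodSymbol if m.isCaseAccessor => m
    }
  }
}
```

These are warnings, not errors, so they do not block the build; the sketch only shows the direction the compiler messages suggest for silencing them.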
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] /Users/geri/work/scalding/scalding-parquet/src/main/java/com/twitter/scalding/parquet/thrift/ParquetTBaseScheme.java: Some input files use or override a deprecated API.
[info] /Users/geri/work/scalding/scalding-parquet/src/main/java/com/twitter/scalding/parquet/thrift/ParquetTBaseScheme.java: Recompile with -Xlint:deprecation for details.
[info] /Users/geri/work/scalding/scalding-parquet/src/main/java/com/twitter/scalding/parquet/tuple/ParquetTupleScheme.java: /Users/geri/work/scalding/scalding-parquet/src/main/java/com/twitter/scalding/parquet/tuple/ParquetTupleScheme.java uses unchecked or unsafe operations.
[info] /Users/geri/work/scalding/scalding-parquet/src/main/java/com/twitter/scalding/parquet/tuple/ParquetTupleScheme.java: Recompile with -Xlint:unchecked for details.
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Short: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Boolean: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)UnsignedByte: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Size: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Array[Byte]: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Long: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Int: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)UnsignedShort: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Double: OK, passed 100 tests.
[info] + JavaStreamEnrichmentsProperties.Can (read/write)Float: OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.sequences equiv well [String]: OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.sequences equiv well [IntTryWrapperClass]: OK, passed 100 tests.
[info] + SerializationProperties.sequences equiv well [(Int, Int)]: OK, passed 100 tests.
[info] + SerializationProperties.sequences equiv well [IntWrapperClass]: OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.antisymmetry: OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.antisymmetry: OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.sequences equiv well [(String, String)]: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.antisymmetry: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.antisymmetry: OK, passed 100 tests.
[info] + SerializationProperties.sequences compare well [(String, String)]: OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.sequences compare well [String]: OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.antisymmetry: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.antisymmetry: OK, passed 100 tests.
[info] + SerializationProperties.sequences compare well [IntWrapperClass]: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.sequences compare well [(Int, Int)]: OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.sequences compare well [Int]: OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.sequences compare well [IntTryWrapperClass]: OK, passed 100 tests.
[info] + SerializationProperties.sequences equiv well [Int]: OK, passed 100 tests.
[info] + SerializationProperties.IntWrapperClass Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.staticSize.orElse(dynamicSize(t)).map { _ == toBytes(t).length }: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.equiv(a, b) && equiv(b, c) => equiv(a, c): OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.String Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.equiv(a, b) => hash(a) == hash(b): OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.antisymmetry: OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.(Int, Int) Ordered.equiv(a, b) == (write(a) == write(b)): OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.transitivity: OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.equiv(a, a) == true: OK, passed 100 tests.
[info] + SerializationProperties.Int Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.roundTrip: OK, passed 100 tests.
[info] + SerializationProperties.(String, String) Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.(Int, String) Ordered.compare(a, b) == compareBinary(aBin, bBin): OK, passed 100 tests.
[info] + SerializationProperties.IntTryWrapperClass Ordered.totality: OK, passed 100 tests.
[info] + SerializationProperties.(String, Int) Ordered.antisymmetry: OK, passed 100 tests.
[info] + UnsignedComparisonLaws.UnsignedLongCompare works: OK, passed 100 tests.
[info] + UnsignedComparisonLaws.UnsignedIntCompare works: OK, passed 100 tests.
[info] + UnsignedComparisonLaws.UnsignedByteCompare works: OK, passed 100 tests.
[info] + WriterReaderProperties.Unit Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Long Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Array[Int] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.(Int, Array[String]) Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Map[Long, Byte] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Byte Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Option[(Int, Double)] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Option[Option[Unit]] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Array[String] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Double Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Int Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.String Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Boolean Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.List[String] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Either[Int, String] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Array[Byte] Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Short Writer/Reader: OK, passed 100 tests.
[info] + WriterReaderProperties.Float Writer/Reader: OK, passed 100 tests.
[info] TraversableHelperLaws:
[info] - Iterator ordering should be Iterable ordering
[info] - Iterator equiv should be Iterable ordering
[info] - sortedCompare matches sort followed by compare List[Int]
[info] - sortedCompare matches sort followed by compare Set[Int]
[info] MacroOrderingProperties:
[info] - Test out Unit
[info] - Test out Boolean
[info] - Test out jl.Boolean
[info] - Test out Byte
[info] - Test out jl.Byte
[info] - Test out Short
[info] - Test out jl.Short
[info] - Test out Char
[info] - Test out jl.Char
[info] - Test out Int
[info] - Test out AnyVal of String
[info] - Test out Tuple of AnyVal's of String
[info] - Test out Tuple of TestSealedAbstractClass
[info] - Test out jl.Integer
[info] - Test out Float
[info] - Test out jl.Float
[info] - Test out Long
[info] - Test out jl.Long
[info] - Test out Double
[info] - Test out jl.Double
[info] - Test out String
[info] - Test out ByteBuffer
[info] - Test out List[Float]
[info] - Test out Queue[Int]
[info] - Test out IndexedSeq[Int]
[info] - Test out HashSet[Int]
[info] - Test out ListSet[Int]
[info] - Test out List[String]
[info] - Test out List[List[String]]
[info] - Test out List[Int]
[info] - Test out SetAlias
[info] - Container.InnerCaseClass
[info] - Test out Seq[Int]
[info] - Test out scala.collection.Seq[Int]
[info] - Test out Array[Byte]
[info] - Test out Vector[Int]
[info] - Test out Iterable[Int]
[info] - Test out Set[Int]
[info] - Test out Set[Double]
[info] - Test out Map[Long, Set[Int]]
[info] - Test out Map[Long, Long]
[info] - Test out HashMap[Long, Long]
[info] - Test out ListMap[Long, Long]
[info] - Test out comparing Maps(3->2, 2->3) and Maps(2->3, 3->2)
[info] - Test out comparing Set("asdf", "jkl") and Set("jkl", "asdf")
[info] - Test known hard String Case
[info] - Test out Option[Int]
[info] - Test out Option[String]
[info] - Test Either[Int, Option[Int]]
[info] - Test Either[Int, String]
[info] - Test Either[Int, Int]
[info] - Test Either[String, Int]
[info] - Test Either[String, String]
[info] - Test out Option[Option[Int]]
[info] - test product like TestCC
[info] - test specific tuple aa1
[info] - test specific tuple 2
[info] - test specific tuple 3
[info] - Test out TestCC
[info] - Test out Sealed Trait
[info] - Test out CaseObject
[info] - Test out (Int, Int)
[info] - Test out (String, Option[Int], String)
[info] - Test out MyData
[info] - Test out MacroOpaqueContainer
[info] - Test out MacroOpaqueContainer inside a case class as an abstract type
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] + Date Properties.Globifying produces matching patterns: OK, passed 100 tests.
[info] + Date Properties.DateRange.length is correct: OK, passed 100 tests.
[info] + Date Properties.RichDate subtraction Roundtrip: OK, passed 100 tests.
[info] + Date Properties.Before/After works: OK, passed 100 tests.
[info] + Date Properties.DateRange.exclusiveUpper works: OK, passed 100 tests.
[info] + Date Properties.Embiggen/extend always contains: OK, passed 100 tests.
[info] + Date Properties.fromMillisecs toMillisecs: OK, passed 100 tests.
[info] + Date Properties.AbsoluteDuration group properties: OK, passed 100 tests.
[info] + Date Properties.Arithmetic works as expected: OK, passed 100 tests.
[info] + Date Properties.each output is contained: OK, passed 100 tests.
[info] + Date Properties.Millisecs rt: OK, passed 100 tests.
[info] + Date Properties.Shifting DateRanges breaks containment: OK, passed 100 tests.
[info] + Globifier Properties.HR Globifier with Year deltas RT's: OK, passed 100 tests.
[info] + Globifier Properties.Day Globifier with hour deltas RT's: OK, passed 100 tests.
[info] + Globifier Properties.Day Globifier with Year deltas RT's: OK, passed 100 tests.
[info] + Globifier Properties.HR Globifier with Day deltas RT's: OK, passed 100 tests.
[info] + Globifier Properties.Day Globifier with Day deltas RT's: OK, passed 100 tests.
[info] + Globifier Properties.HR Globifier with hour deltas RT's: OK, passed 100 tests.
[info] DateTest:
[info] A RichDate
[info] - should implicitly convert strings
[info] - should implicitly convert calendars
[info] - should deal with strings with spaces
[info] - should handle dates with slashes and underscores
[info] - should be able to parse milliseconds
[info] - should throw an exception when trying to parse illegal strings
[info] - should be able to deal with arithmetic operations with whitespace
[info] - should be able to deal with arithmetic operations without hyphens and whitespaces
[info] - should Have same equals & hashCode as Date (crazy?)
[info] - should be well ordered
[info] - should be able to compare with before() and after() with TimeZone in context
[info] - should implicitly convert from long
[info] - should roundtrip successfully
[info] - should know the most recent time units
[info] - should correctly do arithmetic
[info] - should correctly calculate upperBound
[info] - should Have an implicit Ordering
[info] A DateRange
[info] - should correctly iterate on each duration
[info] - should have each partition disjoint and adjacent
[info] - should reject an end that is before its start
[info] Time units
[info] - should have 1000 milliseconds in a sec
[info] - should have 60 seconds in a minute
[info] - should have 60 minutes in a hour
[info] - should have 7 days in a week
[info] AbsoluteDurations
[info] - should behave as comparable
[info] - should add properly
[info] - should have a well behaved max function
[info] Globifiers
[info] - should handle specific hand crafted examples
[info] - should The forward and reverser should match
[info] - should handle random test cases
[info] CalendarOpsTest:
[info] The CalendarOps truncate method
[info] - should not truncate if the specified field is milliseconds
[info] - should truncate to a year
[info] - should truncate to a month
[info] - should truncate to a date
[info] - should truncate to a minute
[info] - should truncate to a second
[info] - should truncate to AM
[info] - should truncate to PM
[info] - should truncate respects DST
[info] ScalaTest
[info] Run completed in 2 minutes, 9 seconds.
[info] Total number of tests run: 19
[info] Suites: completed 2, aborted 0
[info] Tests: succeeded 19, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 19, Failed 0, Errors 0, Passed 19
[info] ScalaTest
[info] Run completed in 2 minutes, 9 seconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] ScalaTest
[info] Run completed in 2 minutes, 9 seconds.
[info] Total number of tests run: 70
[info] Suites: completed 2, aborted 0
[info] Tests: succeeded 70, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 74, Failed 0, Errors 0, Passed 74
[info] ScalaTest
[info] Run completed in 2 minutes, 9 seconds.
[info] Total number of tests run: 39
[info] Suites: completed 2, aborted 0
[info] Tests: succeeded 39, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 41, Failed 0, Errors 0, Passed 41
[info] Compiling 1 Scala source to /Users/geri/work/scalding/scalding-json/target/scala-2.11/test-classes...
[info] Compiling 1 Scala source to /Users/geri/work/scalding/scalding-jdbc/target/scala-2.11/test-classes...
[info] Compiling 1 Scala source to /Users/geri/work/scalding/scalding-db/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:51: [OptionOfOption] Why would you need an Option of an Option?
[warn] case class BadUser8(age: Option[Option[Int]])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:52: [OptionOfOption] Why would you need an Option of an Option?
[warn] case class BadUser9(@size(15)@text age: Option[Option[Int]])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:53: [OptionOfOption] Why would you need an Option of an Option?
[warn] case class BadUser10(@size(2)@size(4) age: Option[Option[Int]])
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:155: [TypeToType] Using toString on something that is already of type String.
[warn] isColumnDefinitionAvailable[User]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:164: [TypeToType] Using toString on something that is already of type String.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[User]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:166: [TypeToType] Using toList on something that is already of type List.
[warn] assert(columnDef.columns.toList === expectedColumns)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:195: [TypeToType] Using toString on something that is already of type String.
[warn] isColumnDefinitionAvailable[User2]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:203: [TypeToType] Using toString on something that is already of type String.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[User2]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:205: [TypeToType] Using toList on something that is already of type List.
[warn] assert(columnDef.columns.toList === expectedColumns)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:222: [TypeToType] Using toString on something that is already of type String.
[warn] DBMacro.toDBTypeDescriptor[User]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:224: [TypeToType] Using toString on something that is already of type String.
[warn] isJDBCTypeInfoAvailable[User]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:232: [TypeToType] Using toList on something that is already of type List.
[warn] assert(DBMacro.toDBTypeDescriptor[User].columnDefn.columns.toList === expectedColumns)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:232: [TypeToType] Using toString on something that is already of type String.
[warn] assert(DBMacro.toDBTypeDescriptor[User].columnDefn.columns.toList === expectedColumns)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:237: [YodaConditions] Yoda conditions using you are.
[warn] val typeDescriptor = DBMacro.toDBTypeDescriptor[VerticaCaseClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:242: [TypeToType] Using toList on something that is already of type List.
[warn] assert(typeDescriptor.columnDefn.columns.toList === expectedColumns)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:274: [YodaConditions] Yoda conditions using you are.
[warn] isColumnDefinitionAvailable[ExhaustiveJdbcCaseClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:277: [YodaConditions] Yoda conditions using you are.
[warn] DBMacro.toDBTypeDescriptor[ExhaustiveJdbcCaseClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:279: [YodaConditions] Yoda conditions using you are.
[warn] isJDBCTypeInfoAvailable[ExhaustiveJdbcCaseClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:297: [YodaConditions] Yoda conditions using you are.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[ExhaustiveJdbcCaseClass]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:299: [TypeToType] Using toList on something that is already of type List.
[warn] assert(columnDef.columns.toList === expectedColumns)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:369: [YodaConditions] Yoda conditions using you are.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[CaseClassWithDate]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:383: [YodaConditions] Yoda conditions using you are.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[CaseClassWithOptions]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:407: [YodaConditions] Yoda conditions using you are.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[CaseClassWithOptions]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:432: [YodaConditions] Yoda conditions using you are.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[CaseClassWithOptions]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:456: [YodaConditions] Yoda conditions using you are.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[CaseClassWithOptions]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:479: [YodaConditions] Yoda conditions using you are.
[warn] val typeDesc = DBMacro.toDBTypeDescriptor[CaseClassWithOptions]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-db/src/test/scala/com/twitter/scalding/db/macros/MacrosUnitTests.scala:100: type MockitoSugar in package mock is deprecated: Please use org.scalatest.mockito.MockitoSugar instead
[warn] class JdbcMacroUnitTests extends WordSpec with Matchers with MockitoSugar {
[warn] ^
[warn] 27 warnings found
[info] Compiling 1 Scala source to /Users/geri/work/scalding/scalding-repl/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-repl/src/test/scala/com/twitter/scalding/ReplTest.scala:61: The outer reference in this type test cannot be checked at run time.
[warn] case TypedPipe.IterablePipe(_) => succeed
[warn] ^
[warn] /Users/geri/work/scalding/scalding-repl/src/test/scala/com/twitter/scalding/ReplTest.scala:62: The outer reference in this type test cannot be checked at run time.
[warn] case TypedPipe.SourcePipe(s) => assert(s.toString.contains("SequenceFile"))
[warn] ^
[warn] two warnings found
[info] Compiling 1 Scala source to /Users/geri/work/scalding/scalding-hraven/target/scala-2.11/test-classes...
[info] Compiling 4 Scala sources to /Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/RatioBasedEstimatorTest.scala:13: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/ReducerEstimatorTest.scala:24: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/ReducerEstimatorTest.scala:54: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/ReducerEstimatorTest.scala:72: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/ReducerEstimatorTest.scala:91: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/ReducerEstimatorTest.scala:108: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/ReducerEstimatorTest.scala:123: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] /Users/geri/work/scalding/scalding-estimators-test/src/test/scala/com/twitter/scalding/reducer_estimation/ReducerEstimatorTest.scala:136: [TypeToType] Using toMap on something that is already of type Map.
[warn] override def config = super.config ++ customConfig.toMap.toMap
[warn] ^
[warn] 8 warnings found
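The `[TypeToType]` warnings above all flag `customConfig.toMap.toMap`: calling `toMap` on something that is already a `Map` is a no-op, so the extra call can simply be dropped. A minimal sketch of the pattern (the config keys and values here are hypothetical, not taken from the test sources):

```scala
object ToMapRedundancy {
  def main(args: Array[String]): Unit = {
    val base: Map[String, String] = Map("io.sort.mb" -> "128")
    val customConfig: Map[String, String] = Map("mapred.reduce.tasks" -> "2")

    // `customConfig` is already a Map, so each `.toMap` returns an equal map;
    // `base ++ customConfig` is all that is needed.
    val preferred = base ++ customConfig
    val flagged   = base ++ customConfig.toMap.toMap // what the linter warns about

    assert(preferred == flagged)
    println(preferred.size)
  }
}
```

The linter warning is purely about redundancy: both forms produce the same merged map, so dropping `.toMap.toMap` changes nothing at runtime.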
[info] Compiling 3 Scala sources to /Users/geri/work/scalding/scalding-hadoop-test/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/PlatformExecutionTest.scala:23: [CloseSourceFile] You should close the file stream after use. (Streams get garbage collected, but it is possible to open too many at once)
[warn] .getLines()
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/PlatformExecutionTest.scala:47: [CloseSourceFile] You should close the file stream after use. (Streams get garbage collected, but it is possible to open too many at once)
[warn] .getLines()
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/PlatformExecutionTest.scala:53: [CloseSourceFile] You should close the file stream after use. (Streams get garbage collected, but it is possible to open too many at once)
[warn] .getLines()
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/PlatformTest.scala:128: [TypeToType] Using toString on something that is already of type String.
[warn] (t.toString, t)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/PlatformTest.scala:467: [TypeToType] Using toList on something that is already of type List.
[warn] .sink(typedRealOutput) { _.map{ f: Float => (f * 10).toInt }.toList shouldBe (outputData.map{ f: Float => (f * 10).toInt }.toList) }
[warn] ^
[warn] /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/PlatformTest.scala:478: [TypeToType] Using toString on something that is already of type String.
[warn] .sink[String]("output") { _.toSet shouldBe data.map(_.toString).toSet }
[warn] ^
[warn] 6 warnings found
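The `[CloseSourceFile]` warnings above concern `scala.io.Source` streams that are read with `.getLines()` but never closed; the stream is eventually garbage collected, but opening too many at once can exhaust file handles. A minimal sketch of the close-after-use pattern the linter prefers (the helper name `readLines` is hypothetical):

```scala
import scala.io.Source

object CloseSourceExample {
  // Materializes all lines and guarantees the underlying stream is closed,
  // instead of relying on garbage collection to release the file handle.
  def readLines(path: String): List[String] = {
    val source = Source.fromFile(path)
    try source.getLines().toList
    finally source.close()
  }

  def main(args: Array[String]): Unit = {
    val f = java.io.File.createTempFile("close-source", ".txt")
    f.deleteOnExit()
    val w = new java.io.PrintWriter(f)
    w.println("hello"); w.println("world"); w.close()

    val lines = readLines(f.getAbsolutePath)
    assert(lines == List("hello", "world"))
    println(lines.mkString(","))
  }
}
```

Note that `.toList` inside the `try` is what makes this safe: it forces the lazy iterator before `close()` runs in the `finally` block.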
[info] Because /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/PlatformTest.scala contains a macro definition, the following dependencies are invalidated unconditionally:
[info] /Users/geri/work/scalding/scalding-hadoop-test/src/test/scala/com/twitter/scalding/platform/TestJobsWithDescriptions.scala
[info] Compiling 9 Scala sources and 3 Java sources to /Users/geri/work/scalding/scalding-commons/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-commons/src/test/scala/com/twitter/scalding/WeightedPageRankTest.scala:47: [UseHeadNotApply] It is idiomatic to use expected.head instead of expected(0) for List
[warn] pageRank(1) shouldBe (expected(0)) +- 0.001
[warn] ^
[warn] /Users/geri/work/scalding/scalding-commons/src/test/scala/com/twitter/scalding/ExecutionKMeansTest.scala:35: [OperationAlwaysProducesZero] This subtraction will always return 0.
[warn] Vector.fill(k)(0.0).updated(cluster, 100.0) ++ Vector.fill(dim - k)(rng.nextDouble / (1e6 * dim))
[warn] ^
[warn] two warnings found
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] Compiling 3 Scala sources and 8 Java sources to /Users/geri/work/scalding/scalding-parquet-scrooge/target/scala-2.11/classes...
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/scala/com/twitter/scalding/parquet/scrooge/Parquet346ScroogeScheme.scala:41: method setRecordConverterClass in object ThriftReadSupport is deprecated: see corresponding Javadoc for more information.
[warn] ThriftReadSupport.setRecordConverterClass(jobConf, classOf[Parquet346ScroogeRecordConverter[_]])
[warn] ^
[warn] one warning found
[warn] bootstrap class path not set in conjunction with -source 1.6
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ScroogeStructConverter.java:320: non-varargs call of varargs method with inexact argument type for last parameter;
[warn] cast to java.lang.Object for a varargs call
[warn] cast to java.lang.Object[] for a non-varargs call and to suppress this warning
[warn] Object result = listMethod.invoke(cObject, null);
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ScroogeStructConverter.java:386: non-varargs call of varargs method with inexact argument type for last parameter;
[warn] cast to java.lang.Object for a varargs call
[warn] cast to java.lang.Object[] for a non-varargs call and to suppress this warning
[warn] result.id = (Integer) valueMethod.invoke(rawScroogeEnum, null);
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ScroogeStructConverter.java:387: non-varargs call of varargs method with inexact argument type for last parameter;
[warn] cast to java.lang.Object for a varargs call
[warn] cast to java.lang.Object[] for a non-varargs call and to suppress this warning
[warn] result.originalName = (String) originalNameMethod.invoke(rawScroogeEnum, null);
[info] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ParquetScroogeScheme.java: /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ParquetScroogeScheme.java uses or overrides a deprecated API.
[info] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ParquetScroogeScheme.java: Recompile with -Xlint:deprecation for details.
[info] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ScroogeStructConverter.java: Some input files use unchecked or unsafe operations.
[info] /Users/geri/work/scalding/scalding-parquet-scrooge/src/main/java/com/twitter/scalding/parquet/scrooge/ScroogeStructConverter.java: Recompile with -Xlint:unchecked for details.
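The non-varargs javac warnings above come from calls like `listMethod.invoke(cObject, null)`: `Method.invoke` takes `Object...`, so a bare `null` is ambiguous between "one null argument" and "null argument array". Casting the `null` (or passing an empty array) makes the intent explicit and silences the warning. A minimal sketch in Scala (the `Greeter` class is hypothetical):

```scala
object VarargsInvoke {
  class Greeter { def greet(): String = "hi" }

  def main(args: Array[String]): Unit = {
    val m = classOf[Greeter].getMethod("greet")
    val g = new Greeter

    // Ambiguous in Java: m.invoke(g, null) -- is null one argument, or no arguments?
    // Unambiguous: pass an explicit (possibly empty) argument array.
    val result = m.invoke(g, Array.empty[AnyRef]: _*)

    assert(result == "hi")
    println(result)
  }
}
```

In Java the equivalent fix is `listMethod.invoke(cObject, (Object[]) null)` for a no-argument call, exactly as the warning text suggests.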
[info] Compiling 2 Scala sources and 2 Java sources to /Users/geri/work/scalding/scalding-parquet-fixtures/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-parquet-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/thrift_scala/test/Address.scala:416: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/thrift_scala/test/Name.scala:418: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] two warnings found
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] Compiling 11 Scala sources to /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestEnum.scala:122: [UnextendedSealedTrait] This sealed trait is never extended
[warn] sealed trait TestEnum extends ThriftEnum with Serializable
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/A.scala:501: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestEnum.scala:81: [UseGetOrElseNotPatMatch] ... match { Some(x) => x; None => TestEnum.this.EnumUnknownTestEnum.apply(value)} can be replaced with .getOrElse(TestEnum.this.EnumUnknownTestEnum.apply(value))
[warn] get(value) match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestLists.scala:1661: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestMaps.scala:1794: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestOptionTypes.scala:1018: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestSerializationOrder.scala:339: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestSerializationOrderItem.scala:324: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestSets.scala:1516: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestStruct.scala:511: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestTypes.scala:1021: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/A.scala:87: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: A): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/thrift/macros/scalathrift/TestOptionTypes.scala:166: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: TestOptionTypes): Unit = {
[warn] ^
[warn] 13 warnings found
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] + TypeDescriptor.roundTrip.(Int, Double, String, Option[(String, Int, Option[Long])]): OK, passed 100 tests.
[info] + TypeDescriptor.roundTrip.Option[(Option[Boolean], Int, String, Option[Long])]: OK, passed 100 tests.
[info] + TypeDescriptor.roundTrip.Option[Int]: OK, passed 100 tests.
[info] + TypeDescriptor.roundTrip.Option[(Int, String, Option[Long])]: OK, passed 100 tests.
[info] + TypeDescriptor.roundTrip.Int: OK, passed 100 tests.
[info] + Config..+(k, v).get(k) == Some(v): OK, passed 100 tests.
[info] + Config.adding many UniqueIDs works: OK, passed 100 tests.
[info] + Config..-(k).get(k) == None: OK, passed 100 tests.
[info] + Config.++ unions keys: OK, passed 100 tests.
[info] + Config.++ == c2.orElse(c1): OK, passed 100 tests.
[info] + Matrix2.evaluate function returns the same cost as optimize: OK, passed 100 tests.
[info] + Matrix2.a cost of an optimized chain of matrix products is <= a random one: OK, passed 100 tests.
[info] + Matrix2.cost of a random plan is <= a random one: OK, passed 100 tests.
[info] + Matrix2.optimizing an optimized plan does not change it: OK, passed 100 tests.
[info] + RichPipe.assignName carries over the old number if it was already an assigned name: OK, passed 100 tests.
[info] + RichPipe.assignName carries over the last 12 characters of the old name if it's more than 12 characters: OK, passed 100 tests.
[info] + RichPipe.assignName carries over the whole old name if it's 12 characters or less: OK, passed 100 tests.
[info] + RichPipe.assignName carries over the last (12-hexdigits) group from the *last* UUID if the old name included more than one: OK, passed 100 tests.
[info] + RichPipe.assignName carries over the over the old number if it was already an assigned name carrying bits from a UUID: OK, passed 100 tests.
[info] + RichPipe.assignName carries over the last (12-hexdigits) group from the UUID if the old name included one: OK, passed 100 tests.
[info] + CoGrouped.DistinctBy.distinctBy to unit gives size 0 or 1: OK, passed 100 tests.
[info] + CoGrouped.DistinctBy.distinctBy works like groupBy(fn).map(_._2.head).toSet: OK, passed 100 tests.
[info] + CoGrouped.DistinctBy.distinctBy.size == map(fn).toSet.size: OK, passed 100 tests.
[info] + CoGrouped.DistinctBy.distinctBy never increases size: OK, passed 100 tests.
[info] + CoGrouped.DistinctBy.distinctBy to different values never changes the list: OK, passed 100 tests.
[info] + CoGrouped.DistinctBy.distinctBy matches a mutable implementation: OK, passed 100 tests.
[info] + SizeHint.Hadamard product does not increase sparsity fraction: OK, passed 100 tests.
[info] + SizeHint.transpose preserves size: OK, passed 100 tests.
[info] + SizeHint.ordering makes sense: OK, passed 100 tests.
[info] + SizeHint.diagonals are smaller: OK, passed 100 tests.
[info] + SizeHint.a#*#b is at most as big as a: OK, passed 100 tests.
[info] + SizeHint.hadamard product of a finite hint to itself preserves size: OK, passed 100 tests.
[info] + SizeHint.adding a sparse matrix to itself doesn't decrease size: OK, passed 100 tests.
[info] + SizeHint.adding a finite hint to itself preserves size: OK, passed 100 tests.
[info] + SizeHint.squaring a finite hint preserves size: OK, passed 100 tests.
[info] + SizeHint.diagonals are about as big as the min(rows,cols): OK, passed 100 tests.
[info] + SizeHint.a+b is at least as big as a: OK, passed 100 tests.
[info] + SizeHint.transpose law is obeyed in total: OK, passed 100 tests.
[info] + SizeHint.addition increases sparsity fraction: OK, passed 100 tests.
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[info] + ExecutionApp Properties.Non-hadoop random args will all end up in the right bucket: OK, passed 100 tests.
[info] + ExecutionApp Properties.adding an hadoop lib jars in the middle will extract it right: OK, passed 100 tests.
[info] + ExecutionApp Properties.adding an hadoop -D parameter in the middle will extract it right: OK, passed 100 tests.
[info] CoGroupTest:
[info] A StarJoinJob
[info] - should be able to work
Cascade{flowStatsList=[Flow{status=SUCCESSFUL, startTime=1502742778360, duration=994, stepsCount=1}, Flow{status=SUCCESSFUL, startTime=1502742779357, duration=265, stepsCount=1}]}
Cascade{flowStatsList=[Flow{status=SUCCESSFUL, startTime=1502742779757, duration=204, stepsCount=1}, Flow{status=SUCCESSFUL, startTime=1502742779961, duration=204, stepsCount=1}]}
[info] TwoPhaseCascadeTest:
[info] A Cascade job
[info] - should verify output got changed by both flows
[info] A Cascade job run though Tool.main
[info] - should verify output got changed by both flows
[info] StatsTest:
[info] StatsTestJob
[info] - should pass if verifyCounters() is true
[info] StatsTestJob
[info] - should fail if verifyCounters() is false
[info] StatsTestJob
[info] - should skip verifyCounters() if job fails
[info] StatsTestJob
[info] - should skip verifyCounters() if verifyCountersInTest is false
[info] PathFilterTest:
[info] RichPathFilter
[info] - should compose ands
[info] - should compose ors
[info] - should negate nots
[info] DelimitedPartitionSourceTest:
[info] PartitionedTsv fed a DelimitedPartition
[info] - should split output by the delimited path
[info] CustomPartitionSourceTest:
[info] PartitionedTsv fed a CustomPartition
[info] - should split output by the custom path
[info] PartialPartitionSourceTest:
[info] PartitionedTsv fed a DelimitedPartition and only a subset of fields
[info] - should split output by the delimited path, discarding the unwanted fields
[info] LookupJoinedTest:
[info] A LookupJoinerJob
[info] - should correctly lookup
[info] WindowLookupJoinedTest:
[info] A WindowLookupJoinerJob
[info] - should correctly lookup
[info] ExecutionUtilTest:
[info] ExecutionUtil
[info] - should run multiple jobs
[info] - should run multiple jobs with executions
[info] - should run multiple jobs with executions and sum results
[info] SmoothedHistoryMemoryEstimatorTest:
[info] A memory history estimator
[info] - should return None without history
[info] - should estimate correct numbers for only reducers
[info] - should estimate correct numbers for only mappers
[info] - should estimate correct numbers
[info] - should estimate less than max cap
[info] - should estimate not less than min cap
[info] ExpandLibJarsGlobsTest:
[info] ExpandLibJarsGlobs
[info] - should expand entries
[info] - should Skips over unmatched entries
[info] - should Multiple paths in libjars
[info] TypedPipeCheckerTest:
[info] TypedPipeChecker
[info] - should run asserts on pipe
[info] TypedPipeChecker
[info] - should give back a list
[info] TypedPipeChecker
[info] - should allow for a list of input to be run through a transform function
[info] HashArrayEqualsWrapperProps:
[info] - Specialized orderings obey all laws for Arrays
[info] - Specialized orderings obey all laws for wrapped Arrays
[info] HashArrayEqualsWrapperTest:
[info] - wrap function returns correct wrapper
[info] - classForTag works correctly
[info] TypedFieldsTest:
[info] A fields API job
[info] - should throw an exception if a field is not comparable
[info] - should group by custom comparator correctly
['male', '184.1', '1']
['male', '165.2', '2']
['male', '125.4', '3']
['female', '172.2', '1']
['female', '128.6', '2']
tuples count: 5
[info] ScanLeftTest:
[info] A simple ranking scanleft job
[info] - should produce correct number of records when filtering out null values
[info] - should create correct ranking per group, 1st being the heighest person of that group
[info] TupleAdderTest:
[info] A TupleAdderJob
[info] - should be able to use generated tuple adders
(and,1)
(hack,4)
(and,1)
(hack,4)
[info] TypedPipeTest:
[info] A TypedPipe
[info] - should 0: count words correctly
[info] - should 1: count words correctly
[info] TypedSumByKeyTest:
[info] A TypedSumByKeyPipe
[info] - should 0: count words correctly
[info] - should 1: count words correctly
[info] TypedPipeMonoidTest:
[info] typedPipeMonoid.zero
[info] - should be equal to TypePipe.empty
[info] TypedPipeSortByTest:
[info] - groups should not be disturbed by sortBy
[info] TypedPipeJoinTest:
[info] A TypedPipeJoin
[info] - should correctly join
[info] TypedPipeJoinKryoTest:
[info] OpaqueJoinBox
[info] - should not be serializable
[info] - should closure not be serializable
[info] A TypedPipeJoinKryo
[info] - should correctly join
[info] TypedPipeDistinctTest:
[info] A TypedPipeDistinctJob
[info] - should correctly count unique item sizes
[info] TypedPipeDistinctByTest:
[info] A TypedPipeDistinctByJob
[info] - should correctly count unique item sizes
[info] TypedPipeGroupedDistinctJobTest:
[info] A TypedPipeGroupedDistinctJob
[info] - should correctly generate unique items
[info] - should correctly count unique item sizes
[info] TypedPipeHashJoinTest:
[info] A TypedPipeHashJoinJob
[info] - should correctly join
[info] TypedPipeTypedTest:
[info] A TypedImplicitJob
[info] - should find max word
Dumping custom counters:
onCompleteMapper 1
onCompleteReducer 1
[info] TypedPipeWithOnCompleteTest:
[info] A TypedWithOnCompleteJob
[info] - should have the correct output
[info] - should have onComplete called on mapper
[info] - should have onComplete called on reducer
[info] TypedPipeWithOuterAndLeftJoinTest:
[info] A TypedPipeWithOuterAndLeftJoin
[info] - should have output for user 1
[info] - should have output for user 5
[info] - should not have output for user 99
[info] TypedPipeJoinCountTest:
[info] A com.twitter.scalding.TJoinCountJob
[info] - should 0: correctly reduce after cogroup
[info] - should 1: correctly do a simple join
[info] - should 2: correctly do a simple leftJoin
[info] - should 3: correctly reduce after cogroup
[info] - should 4: correctly do a simple join
[info] - should 5: correctly do a simple leftJoin
[info] A com.twitter.scalding.TNiceJoinCountJob
[info] - should 0: correctly reduce after cogroup
[info] - should 1: correctly do a simple join
[info] - should 2: correctly do a simple leftJoin
[info] - should 3: correctly reduce after cogroup
[info] - should 4: correctly do a simple join
[info] - should 5: correctly do a simple leftJoin
[info] A com.twitter.scalding.TNiceJoinByCountJob
[info] - should 0: correctly reduce after cogroup
[info] - should 1: correctly do a simple join
[info] - should 2: correctly do a simple leftJoin
[info] - should 3: correctly reduce after cogroup
[info] - should 4: correctly do a simple join
[info] - should 5: correctly do a simple leftJoin
[info] TypedPipeCrossTest:
[info] A TCrossJob
[info] - should 0: create a cross-product
[info] - should 1: create a cross-product
[info] TypedJoinTakeTest:
[info] A TJoinTakeJob
[info] - should 0: dedup keys by using take
[info] - should 1: dedup keys by using take
[info] TypedGroupAllTest:
[info] A TGroupAllJob
[info] - should 0: create sorted output
[info] - should 1: create sorted output
[info] TSelfJoinTest:
[info] A TSelfJoin
[info] TypedJoinWCTest:
[info] A TJoinWordCount
[info] - should create sorted output
[info] TypedLimitTest:
[info] A TypedLimitJob
[info] - should not have more than the limited outputs
[info] TypedFlattenTest:
[info] A TypedLimitJob
[info] - should correctly flatten
[info] TypedMergeTest:
[info] A TypedMergeJob
[info] - should 0: correctly flatten
[info] - should 1: correctly flatten
[info] TypedShardTest:
[info] A TypedShardJob
[info] - should correctly flatten
[info] TypedLocalSumTest:
[info] A TypedLocalSumJob
[info] - should 0: not expand and have correct total sum
[info] - should 1: not expand and have correct total sum
[info] TypedHeadTest:
[info] A TypedHeadJob
[info] - should correctly take the first
[info] TypedSortWithTakeTest:
[info] A TypedSortWithTakeJob
[info] - should correctly take the first
[info] - should correctly take the first using sorted.reverse.take
[info] TypedLookupJobTest:
[info] A TypedLookupJob
[info] - should correctly TypedPipe.hashLookup
[info] TypedLookupReduceJobTest:
[info] A TypedLookupJob
[info] - should correctly TypedPipe.hashLookup
[info] TypedFilterTest:
[info] A TypedPipe
[info] - should filter and filterNot elements
[info] TypedPartitionTest:
[info] A TypedPipe
[info] - should partition elements
[info] TypedMultiJoinJobTest:
[info] A TypedMultiJoinJob
[info] - should correctly do a multi-join
[info] TypedMultiSelfJoinJobTest:
[info] A TypedMultiSelfJoinJob
[info] - should correctly do a multi-self-join
[info] TypedMapGroupTest:
[info] A TypedMapGroup
[info] - should correctly do a mapGroup
[info] TypedSelfCrossTest:
[info] A TypedSelfCrossJob
[info] - should 0: not change the length of the input
[info] - should 1: not change the length of the input
[info] TypedSelfLeftCrossTest:
[info] A TypedSelfLeftCrossJob
[info] - should 0: attach the sum of all values correctly
[info] - should 1: attach the sum of all values correctly
[info] JoinMapGroupJobTest:
[info] A JoinMapGroupJob
[info] - should not duplicate keys
[info] MapValueStreamNonEmptyIteratorTest:
[info] A MapValueStreamNonEmptyIteratorJob
[info] - should not have iterators of size 0
[info] NullSinkJobTest:
[info] A NullSinkJob
[info] - should have a side effect
[info] TypedSketchJoinJobTest:
[info] A TypedSketchJoinJob
[info] - should get the same result as an inner join
[info] - should get the same result when half the left keys are missing
[info] - should get the same result with a massive skew to one key
[info] - should still work with only one reducer
[info] - should still work with massive skew and only one reducer
[info] TypedSketchLeftJoinJobTest:
[info] A TypedSketchLeftJoinJob
[info] - should get the same result as a left join
[info] - should get the same result when half the left keys are missing
[info] - should get the same result with a massive skew to one key
[info] - should still work with only one reducer
[info] - should still work with massive skew and only one reducer
[info] IntegralCompTest:
[info] IntegralComparator
[info] - should recognize integral types
[info] - should handle null inputs
[info] - should have consistent hashcode
[info] - should Compare strings properly
[info] MemoryTest:
[info] - basic word count
[info] - mapGroup works
[info] - hashJoin works
[info] SourceSpec:
[info] A case class Source
[info] - should inherit equality properly from TimePathedSource
[info] A Source with overriden transformForRead and transformForWrite
[info] - should respect these overrides even for tests
[info] TypedPipeAndThenTest:
[info] Mappable.andThen is like TypedPipe.map
[info] - should TypedPipe return proper results
[info] - should Mappable.andThen return proper results
[info] JobTestTest:
[info] A JobTest
[info] - should error helpfully when a source in the job doesn't have a corresponding .source call
[info] MutatedSourceTest:
[info] A MutatedSourceJob
[info] - should Not throw when using a converted source
[info] ContraMappedAndThenSourceTest:
[info] A ContraMappedAndThenSourceJob
[info] - should Not throw when using a converted source
[info] TypedSimilarityTest:
[info] A TypedCosineJob
[info] - should compute cosine similarity
[info] - should compute dimsum cosine similarity
[info] MultiJoinTest:
[info] The flatten methods
[info] - should actually match the outputs of joins
[info] - should Have implicit flattenValueTuple methods for low arity
[info] TypedSketchJoinJobForEmptyKeysTest:
[info] A TypedSketchJoinJobForEmptyKeysTest
[info] - should Sketch leftJoin with a single left key should be correct
[info] MacrosUnitTests:
[info] MacroGenerated TupleConverter
[info] - should Not compile for Option[Option[Int]]
[info] MacroGenerated TupleSetter
[info] - should Generate the setter SampleClassA
[info] - should Generate the setter SampleClassB
[info] - should Generate the setter SampleClassC
[info] - should Generate the setter SampleClassD
[info] - should Generate the setter SampleClassE
[info] - should Generate the setter SampleClassF
[info] - should Generate the setter SampleClassG
[info] - should be serializable for case class A
[info] - should be serializable for case class B
[info] - should be serializable for case class C
[info] - should be serializable for case class D
[info] - should be serializable for case class E
[info] - should be serializable for case class F
[info] MacroGenerated TupleConverter
[info] - should Generate the converter SampleClassA
[info] - should Generate the converter SampleClassB
[info] - should Generate the converter SampleClassC
[info] - should Generate the converter SampleClassD
[info] - should Generate the converter SampleClassE
[info] - should Generate the converter SampleClassF
[info] - should Generate the converter SampleClassG
[info] - should Generate the converter Option[(Int, String)]
[info] - should Generate the converter Option[(Int, Option[(Long, String)])]
[info] - should Not generate a convertor for SampleClassFail
[info] - should be serializable for case class A
[info] - should be serializable for case class B
[info] - should be serializable for case class C
[info] - should be serializable for case class D
[info] - should be serializable for case class E
[info] - should be serializable for case class F
[info] MacroGenerated TypeDescriptor
[info] - should Generate the converter SampleClassA
[info] - should Generate the converter SampleClassB
[info] - should Generate the converter SampleClassC
[info] - should Generate the converter SampleClassD
[info] - should Generate the converter SampleClassE
[info] - should Generate the converter SampleClassF
[info] - should Generate the converter SampleClassG
[info] - should Not generate a convertor for SampleClassFail
[info] - should be serializable for case class A
[info] - should be serializable for case class B
[info] - should be serializable for case class C
[info] - should be serializable for case class D
[info] - should be serializable for case class E
[info] - should be serializable for case class F
[info] MacroGenerated TupleSetter and TupleConverter
[info] - should round trip class -> tupleentry -> class
[info] - should Case Class should form expected tuple
[info] - should round trip tupleentry -> class -> tupleEntry
[info] - should Case Class should form expected Fields
[info] - should Case Class should form expected Fields with Options
[info] - should Case Class should form expected Fields with Unknown types
[info] - should Case Class should form expected Indexed Fields
[info] Matrix2Test:
[info] A MatrixSum job
[info] - should correctly compute sums
[info] A MatrixSum job with Orderedserialization
[info] - should correctly compute sums
[info] A Matrix2Sum3 job, where the Matrix contains tuples as values,
[info] - should correctly compute sums
[info] A Matrix2SumChain job
[info] - should correctly compute sums
[info] A Matrix2HadSum job
[info] - should correctly compute a combination of a Hadamard product and a sum
[info] A Matrix2 RowRowHad job
[info] - should correctly compute a Hadamard product of row vectors
[info] A Matrix2 ZeroHad job
[info] - should correctly compute a Hadamard product with a zero matrix
[info] A Matrix2Prod job
[info] - should correctly compute products
[info] A Matrix2JProd job
[info] - should correctly compute products with infinite matrices
[info] A Matrix2ProdSum job
[info] - should correctly compute products
[info] A Matrix2 Propagation job
[info] - should correctly propagate columns
[info] - should correctly propagate rows
[info] A Matrix2 Cosine job
[info] - should correctly compute cosine similarity
[info] A Matrix2 Normalize job
[info] - should correctly compute l1 normalization for matrix with double values
[info] - should correctly compute l1 normalization for matrix with long values
[info] A Matrix2 Scalar2Ops job
[info] - should correctly compute M * 3
[info] - should correctly compute M / 3
[info] - should correctly compute 3 * M
[info] - should correctly compute M * Tr(M)
[info] - should correctly compute Tr(M) * M
[info] - should correctly compute M / Tr(M)
[info] MemoryEstimatorStepStrategyTest:
[info] A Memory estimator step strategy
[info] - should set xmx settings correctly
[info] - should set xmx settings correctly with empty original config
[info] MultipleSourcesSpecTest:
[info] A test with two sources
[info] - should accept an operation with two input pipes
[info] - should accept an operation with two input pipes using Tuples
[info] A test with three sources
[info] - should accept an operation with three input pipes
[info] A test with four sources
[info] - should compile mixing an operation with inconsistent number of input pipes but fail at runtime
[info] - should be used with a function accepting a list of sources because there is no implicit for functions with more than three input pipes
[info] SingleSourceSpecTest:
[info] A test with single source
[info] - should accept an operation with a single input rich pipe
[info] - should accept an operation with a single input pipe
[info] - should work with output as Tuple
[info] - should work with input as simple type
[info] - should work with input as Tuple
[info] StringUtilityTest:
[info] fastSplitTest
[info] - should be able to split white space
[info] - be able to split other separators
[info] - be able to split only one separators
[info] - be able to split when separator doesn't show up
[info] StringUtilityPropertyTest:
[info] - fastSplit(s, sep) should match s.split(sep, -1) for non-regex sep
[info] ReduceOperationsTest:
[info] A sortWithTake job
[info] - should grouped list
[info] A sortedTake job
[info] - should grouped list
[info] A sortedReverseTake job
[info] - should grouped list
[info] An approximateUniqueCount job
[info] - should grouped OS count
[info] RequireOrderedSerializationTest:
[info] A NoOrderedSerJob
[info] - should throw when run
[info] A OrderedSerJob
[info] - should run
[info] ConfigTest:
[info] A Config
[info] - should cascadingAppJar works
[info] - should default has serialization set
[info] - should default has chill configured
[info] - should setting timestamp twice does not change it
[info] - should adding UniqueIDs works
[info] - should roundtrip Args
[info] - should throw when Args has been manually modified
[info] - should Default serialization should have tokens
[info] - should addDistributedCacheFile works
[info] - should multiple addDistributedCacheFile work
[info] PackTest:
[info] A ContainerPopulationJob
[info] - should correctly populate container objects
[info] A ContainerToPopulationJob
[info] - should correctly populate container objects
[info] - should correctly populate container case class objects
[info] A FatContainerPopulationJob
[info] - should correctly populate a fat container object
[info] A FatContainerToPopulationJob
[info] - should correctly populate a fat container object
[info] TypedPipeDiffTest:
[info] - diff works for objects with ordering and good hashcodes
[info] - diffArrayPipes works for arrays
[info] - diffWithoutOrdering works for objects with ordering and good hashcodes
[info] - diffWithoutOrdering does not require ordering
[info] - diffWithoutOrdering works even with hash collisions
[info] - diffArrayPipesWithoutOrdering works for arrays of objects with no ordering
[info] TypedPipeDiffLaws:
[info] - diffLaws
[info] - diffArrayLaws
[info] - diffByGroupLaws
[info] SourceListSpecTest:
[info] A test with a list of sources
[info] - should compile mixing it with a multi pipe function but fail if not same cardinality between given and when clause
[info] - should work properly with a multi rich-pipe function with same cardinality
[info] - should work properly with a multi pipe function with same cardinality
[info] - should work properly with a function accepting a list of rich pipes
[info] - should work properly with a function accepting a list of pipes
[info] MatrixTest:
[info] A MatrixProd job
[info] - should correctly compute products
[info] A MatrixBlockProd job
[info] - should correctly compute block products
[info] A MatrixSum job
[info] - should correctly compute sums
[info] A MatrixSum job, where the Matrix contains tuples as values,
[info] - should correctly compute sums
[info] A Matrix Randwalk job
[info] - should correctly compute matrix randwalk
[info] A Matrix Cosine job
[info] - should correctly compute cosine similarity
[info] A Matrix Covariance job
[info] - should correctly compute matrix covariance
[info] A Matrix VctProd job
[info] - should correctly compute vector inner products
[info] A Matrix VctDiv job
[info] - should correctly compute vector element-wise division
[info] A Matrix ScalarOps job
[info] - should correctly compute M * 3
[info] - should correctly compute 3 * M
[info] - should correctly compute M / 3
[info] - should correctly compute M * Tr(M)
[info] - should correctly compute Tr(M) * M
[info] - should correctly compute M / Tr(M)
[info] A Matrix Diagonal job
[info] - should correctly compute diag * matrix
[info] - should correctly compute diag * diag
[info] - should correctly compute matrix * diag
[info] - should correctly compute diag * col
[info] - should correctly compute row * diag
[info] A Propagation job
[info] - should correctly propagate columns
[info] - should correctly propagate rows
[info] A MapWithIndex job
[info] - should correctly mapWithIndex on Row
[info] - should correctly mapWithIndex on Matrix
[info] A Matrix RowMatProd job
[info] - should correctly compute a new row vector
[info] A Matrix MatColProd job
[info] - should correctly compute a new column vector
[info] A Matrix RowRowDiff job
[info] - should correctly subtract row vectors
[info] A Matrix VctOuterProd job
[info] - should correctly compute the outer product of a column and row vector
[info] A Matrix RowRowSum job
[info] - should correctly add row vectors
[info] A Matrix RowRowHad job
[info] - should correctly compute a Hadamard product of row vectors
[info] A FilterMatrix job
[info] - should correctly remove elements
[info] - should correctly keep elements
[info] A KeepRowsCols job
[info] - should correctly keep row vectors
[info] - should correctly keep col vectors
[info] A RemoveRowsCols job
[info] - should correctly keep row vectors
[info] - should correctly keep col vectors
[info] A Scalar Row Right job
[info] - should 0: correctly compute a new row vector
[info] - should 1: correctly compute a new row vector
[info] A Scalar Row Left job
[info] - should 0: correctly compute a new row vector
[info] - should 1: correctly compute a new row vector
[info] A Scalar Col Right job
[info] - should 0: correctly compute a new col vector
[info] - should 1: correctly compute a new col vector
[info] A Scalar Col Left job
[info] - should 0: correctly compute a new col vector
[info] - should 1: correctly compute a new col vector
[info] A Scalar Diag Right job
[info] - should 0: correctly compute a new diag matrix
[info] - should 1: correctly compute a new diag matrix
[info] A Scalar Diag Left job
[info] - should 0: correctly compute a new diag matrix
[info] - should 1: correctly compute a new diag matrix
[info] A Col Normalizing job
[info] - should 0: correctly compute a new col vector
[info] - should 1: correctly compute a new col vector
[info] A Col Diagonal job
[info] - should correctly compute the size of the diagonal matrix
[info] A Row Normalizing job
[info] - should 0: correctly compute a new row vector
[info] - should 1: correctly compute a new row vector
[info] TimePathedSourceTest:
[info] TimePathedSource.hdfsWritePath
[info] - should crib if path == /*
[info] - should crib if path doesn't end with /*
[info] - should work for path ending with /*
[info] NumberJoinTest:
[info] A NumberJoinerJob
[info] - should not throw when joining longs with ints
[info] SpillingTest:
[info] A SpillingJob
[info] - should work when number of keys exceeds spill threshold
[info] GroupRandomlyJobTest:
[info] A GroupRandomlyJob
[info] ShuffleJobTest:
[info] A ShuffleJob
[info] MapToGroupBySizeSumMaxTest:
[info] A MapToGroupBySizeSumMaxJob
[info] - should produce correct size, sum, max
[info] PartitionJobTest:
[info] A PartitionJob
[info] MRMTest:
[info] A MRMJob
[info] - should use reduce to compute xor
[info] - should use mapReduceMap to round-trip input
[info] - should use flattenTo
[info] JoinTest:
[info] A JoinJob
[info] - should join tuples with the same key
[info] CollidingKeyJoinTest:
[info] A CollidingKeyJoinJob
[info] - should join tuples with the same key
[info] TinyJoinTest:
[info] A TinyJoinJob
[info] - should 0: join tuples with the same key
[info] - should 1: join tuples with the same key
[info] TinyCollisionJoinTest:
[info] A TinyCollisionJoinJob
[info] - should join tuples with the same key
[info] TinyThenSmallJoinTest:
[info] A TinyThenSmallJoin
[info] - should 0: join tuples with the same key
[info] - should 1: join tuples with the same key
(a,1,NULL)
(b,2,-1)
(c,3,5)
(a,1,NULL)
(b,2,-1)
(c,3,5)
[info] LeftJoinTest:
[info] A LeftJoinJob
[info] - should 0: join tuples with the same key
[info] - should 1: join tuples with the same key
(a,1,NULL)
(b,2,-1)
(c,3,5)
(a,1,NULL)
(b,2,-1)
(c,3,5)
[info] LeftJoinWithLargerTest:
[info] A LeftJoinWithLargerJob
[info] - should 0: join tuples with the same key
[info] - should 1: join tuples with the same key
[info] MergeTest:
[info] A MergeTest
[info] - should correctly merge two pipes
[info] - should correctly self merge
[info] SizeAveStdSpec:
[info] A sizeAveStd job
[info] - should correctly compute size, ave, stdev
[info] DoubleGroupSpec:
[info] A DoubleGroupJob
[info] - should correctly build histogram
[info] GroupUniqueSpec:
[info] A GroupUniqueJob
[info] - should correctly count unique sizes
[info] DiscardTest:
[info] A DiscardTestJob
[info] - should must reduce down to one line
[info] - should must correctly discard word column
[info] HistogramTest:
[info] A HistogramJob
[info] - should must reduce down to a single line for a trivial input
[info] - should must get the result right
[info] ForceReducersTest:
[info] A ForceReducersJob
[info] - should 0: must get the result right
[info] - should 1: must get the result right
[info] ToListTest:
[info] A ToListJob
[info] - should must have the right number of lines
[info] - should must get the result right
[info] A NullListJob
[info] - should must have the right number of lines
[info] - should must return an empty list for null key
[info] CrossTest:
[info] A CrossJob
[info] - should 0: must look exactly right
[info] - should 1: must look exactly right
[info] GroupAllCrossTest:
[info] A GroupAllCrossJob
[info] - should 0: must look exactly right
[info] - should 1: must look exactly right
[info] SmallCrossTest:
[info] A SmallCrossJob
[info] - should 0: must look exactly right
[info] - should 1: must look exactly right
[info] TopKTest:
[info] A TopKJob
[info] - should must look exactly right
[info] ScanTest:
[info] A ScanJob
[info] - should 0: have a working scanLeft
[info] - should 1: have a working scanLeft
[info] TakeTest:
[info] A TakeJob
[info] - should groupAll must see everything in same order
[info] - should take(2) must only get 2
[info] DropTest:
[info] A DropJob
[info] - should groupAll must see everything in same order
[info] - should drop(2) must only get 1
[info] PivotTest:
[info] A PivotJob
[info] - should unpivot columns correctly
[info] - should pivot back to the original
[info] - should pivot back to the original with the missing column replace by the specified default
[info] IterableSourceTest:
[info] A IterableSourceJob
[info] - should 0: Correctly joinWithSmaller
[info] - should 1: correctly joinWithTiny
[info] - should 2: correctly implicitly joinWithTiny
[info] - should 3: Correctly joinWithSmaller
[info] - should 4: correctly joinWithTiny
[info] - should 5: correctly implicitly joinWithTiny
[info] HeadLastTest:
[info] A HeadLastJob
[info] - should Correctly do head/last
[info] HeadLastUnsortedTest:
[info] A HeadLastUnsortedTest
[info] - should Correctly do head/last
[info] MkStringToListTest:
[info] A IterableSourceJob
[info] - should Correctly do mkString/toList
[info] InsertJobTest:
[info] An InsertJob
[info] - should Correctly insert a constant
[info] FoldJobTest:
[info] A FoldTestJob
[info] - should Correctly do a fold with MutableSet
[info] InnerCaseTest:
[info] An InnerCaseJob
[info] - should Correctly handle inner case classes
[info] NormalizeTest:
[info] A NormalizeJob
[info] - should must be normalized
[info] ForceToDiskTest:
[info] A ForceToDiskJob
[info] - should 0: run correctly when combined with joinWithTiny
[info] - should 1: run correctly when combined with joinWithTiny
[info] ItsATrapTest:
[info] An AddTrap
[info] - should must contain all numbers in input except for 1
[info] - should must contain all 1s and fields in input
[info] TypedItsATrapTest:
[info] A Typed AddTrap with many traps
[info] - should output must contain all odd except first
[info] - should trap1 must contain only the first
[info] - should trap2 must contain the even numbered
[info] A Typed AddTrap with many erroneous maps
[info] - should output must contain all odd except first
[info] - should trap must contain the first and the evens
[info] GroupAllToListTest:
[info] A GroupAllToListTestJob
[info] - should must properly aggregate stuff into a single map
[info] ToListGroupAllToListSpec:
[info] A ToListGroupAllToListTestJob
List((us,List(1)), (jp,List(3, 2)), (gb,List(3, 1)))
[info] - should must properly aggregate stuff in hadoop mode
List((us,List(1)), (jp,List(3, 2)), (gb,List(3, 1)))
[info] - should must properly aggregate stuff in local model
[info] Function2Test:
[info] A Function2Job
[info] - should convert a function2 to tupled function1
[info] SampleWithReplacementTest:
[info] A SampleWithReplacementJob
[info] - should sampleWithReplacement must sample items according to a poisson distribution
[info] VerifyTypesJobTest:
[info] Verify types operation
[info] - should put bad records in a trap
[info] SortingJobTest:
[info] A SortingJob
[info] - should keep all the columns
[info] CollectJobTest:
[info] A CollectJob
[info] FilterJobTest:
[info] A FilterJob
[info] FilterNotJobTest:
[info] A FilterNotJob
[info] CounterJobTest:
[info] A CounterJob
Dumping custom counters:
foo_bar 10
reduce_hit 2
age_group_older_than_18 3
[info] - should have the right counter and output values
[info] DailySuffixTsvTest:
[info] A DailySuffixTsv Source
[info] - should read and write data
[info] PartitionedTextLineTest:
[info] PartitionedTextLine
/tmp/scalding/a75ba3c6-6681-4841-9bd3-0f05a61487e2/com.twitter.scalding.typed.PartitionedTextLine521782338
[info] - should be able to split output by a single partition
/tmp/scalding/373a525f-a92c-41fa-bf8a-82d502fbc64a/com.twitter.scalding.typed.PartitionedTextLine856946891
[info] - should be able to split output by multiple partitions
[info] TypedApiTest:
[info] A test with a single source
[info] - should accept an operation from working with a single tuple-typed pipe
[info] - should accept an operation from single case class-typed pipe
[info] A test with a two sources
[info] - should accept an operation from two tuple-typed pipes
[info] - should accept an operation from two case classes-typed pipes
[info] A test with a list of sources
[info] - should Work as if combining the sources with the And operator but requires explicit cast of the input pipes
[info] - should not checking the types of the sources and fail if any error occurs
[info] - should be created when adding a source to four sources
[info] DistributedCacheFileSpec:
[info] DistributedCacheFile
[info] - should symlinkNameFor must return a hashed name
[info] KryoTest:
[info] KryoSerializers and KryoDeserializers
[info] - should round trip for KryoHadoop
[info] - should round trip any non-array object
[info] - should handle arrays
[info] - should handle scala singletons
[info] - should handle Date, RichDate and DateRange
[info] - should Serialize a giant list
[info] TestTapFactoryTest:
[info] A test tap created by TestTapFactory
[info] - should error helpfully when a source is not in the map for test buffers
[info] HistogramJobTest:
[info] A HistogramJob
[info] - should correctly compute the min
[info] - should correctly compute the max
[info] - should correctly compute the sum
[info] - should correctly compute the mean
[info] - should correctly compute the stdDev
[info] - should correctly compute a CDF
[info] ExecutionTest:
[info] An Execution
[info] - should run
[info] - should run with zip
[info] - should lift to try
[info] - should lift to try on exception
[info] - should merge fanouts without error
[info] - should If either fails, zip fails, else we get success
[info] - should Config transformer will isolate Configs
[info] - should Config transformer will interact correctly with the cache
[info] - should Config transformer will interact correctly with the cache when writing
[info] - should correctly add cached file into config
[info] - should correctly add cached files into config
[info] ExecutionApp
[info] - should parse hadoop args correctly
[info] An ExecutionJob
[info] - should run correctly
[info] Executions
[info] - should shutdown hook should clean up temporary files
[info] - should clean up temporary files on exit
[info] - should clean up temporary files on exit with a zip
[info] - should evaluate once per run
[info] - should zip does not duplicate counters
[info] - should Running a large loop won't exhaust boxed instances
[info] - should evaluate shared portions just once, writeExecution
[info] - should evaluate shared portions just once, forceToDiskExecution
[info] - should evaluate shared portions just once, forceToDiskExecution with execution cache
[info] - should Ability to do isolated caches so we don't exhaust memory
[info] - should handle failure
[info] - should handle an error running in parallel
[info] - should run in parallel
[info] - should block correctly
[info] - should can hashCode, compare, and run a long sequence
[info] - should caches a withId Execution computation
[info] should maintains equality and hashCode after reconstruction
[info] - when Execution.fromFuture
[info] - when Execution.fromFn
[info] - when Execution.withId
[info] - when Execution#map
[info] - when Execution.zip
[info] - when Execution.sequence
[info] should Has consistent hashCode and equality for mutable
[info] - when Execution.fromFuture
[info] - when Execution.fromFn
[info] - when Execution.withId
[info] - when Execution#map
[info] - when Execution#zip
[info] - when Execution.sequence
[info] ArgHelpTest:
[info] ArgHelper
[info] - should print help when asked
[info] ArgHelper
[info] - should run job without help
[info] ArgHelper
[info] - should call help even when given missing args
[info] ArgHelper
[info] - should not fail when all args are described
[info] ArgHelper
[info] - should fail when all args are not described
[info] CumulativeSumTest1:
[info] A simple ranking cumulative sum job
[info] - should produce correct number of records when filtering out null values
[info] - should create correct ranking per group, 1st being the heighest person of that group
[info] A partitioned ranking cumulative sum job
[info] - should produce correct number of records when filtering out null values
[info] - should create correct ranking per group, 1st being the heighest person of that group
[info] SkewJoinPipeTest:
[info] A SkewInnerProductJob
[info] - should compute skew join with sampleRate = 0.001, using strategy A
[info] - should compute skew join with sampleRate = 0.001, using strategy B
[info] - should compute skew join with sampleRate = 0.1, using strategy A
[info] - should compute skew join with sampleRate = 0.1, using strategy B
[info] - should compute skew join with sampleRate = 0.9, using strategy A
[info] - should compute skew join with sampleRate = 0.9, using strategy B
[info] - should compute skew join with replication factor 5, using strategy A
[info] - should compute skew join with reducers = 10, using strategy A
[info] - should compute skew join with reducers = 10, using strategy B
[info] CollidingKeySkewJoinTest:
[info] A CollidingSkewInnerProductJob
[info] - should compute skew join with colliding fields, using strategy A
[info] - should compute skew join with colliding fields, using strategy B
[info] FieldImpsTest:
[info] Field
[info] - should contain manifest
[info] RichFields
[info] - should convert to Fields
[info] - should convert from Fields
[info] - should throw an exception on when converting a virtual Fields instance
[info] Fields conversions
[info] - should convert from ints
[info] - should convert from strings
[info] - should convert from symbols
[info] - should convert from com.twitter.scalding.Field instances
[info] - should convert from enumeration values
[info] - should convert from enumerations
[info] - should convert from general int tuples
[info] - should convert from general string tuples
[info] - should convert from general symbol tuples
[info] - should convert from general com.twitter.scalding.Field tuples
[info] - should convert from general enumeration value tuples
[info] - should convert to a pair of Fields from a pair of values
[info] - should correctly see if there are ints
[info] - should correctly determine default modes
[info] Matrix2OptimizationSpec:
[info] Matrix multiplication chain optimization
[info] - should handle a single matrix
[info] - should handle two matrices
[info] - should handle an example with 6 matrices
[info] - should not change an optimized plan
[info] - should change an unoptimized plan
[info] - should handle an optimized plan with sum
[info] - should handle an unoptimized plan with sum
[info] - should not break A*(B+C)
[info] - should handle an unoptimized global plan
[info] - should handle an optimized global plan
[info] - should handle a G^5 V plan
[info] - should handle an optimized G^5 V plan
[info] - should handle a G^8 plan
[info] CombinatoricsJobTest:
[info] A Combinatorics Job
[info] - should correctly compute 10 permute 3 equals 720
[info] - should correctly compute 5 choose 2 equals 10
[info] - should correctly compute 169 tuples that allow you to invest $1000 among the 4 given stocks
[info] - should correctly compute 101 non-zero tuples that allow you to invest $1000 among the 4 given stocks
[info] AlgebraJobTest:
[info] A AlgebraJob
[info] - should correctly do algebra
[info] A ComplicatedAlgebraJob
[info] - should correctly do complex algebra
[info] SideEffectTest:
[info] Zipper should do create zipped sequence. Coded with side effect
[info] - should correctly compute zipped sequence
[info] SideEffectBufferTest:
[info] ZipBuffer should do create two zipped sequences, one for even lines and one for odd lines. Coded with side effect
[info] - should correctly compute zipped sequence
[info] TypedDelimitedTest:
[info] A TypedTsv Source
[info] - should read and write data
[info] A TypedCsv Source
[info] - should read and write data
[info] A TypedPsv Source
[info] - should read and write data
[info] A TypedOsv Source
[info] - should read and write data
[info] A DailySuffixTypedTsv Source
[info] - should read and write data
[info] TemplateSourceTest:
[info] TemplatedTsv
[info] - should split output by template
[info] TypedTextTest:
[info] - Test with a flat tuple
[info] - Test with a nested tuple
[info] - Test with a raw type
[info] - Test with a tuple
[info] - Test with an Optional Int
[info] - Test with an Int
[info] MacroDepHygiene:
[info] TupleSetter macro
[info] - should work fine without any imports
[info] - should implicitly work fine without any imports
[info] - should fail if not a case class
[info] TupleConverter macro
[info] - should work fine without any imports
[info] - should implicitly work fine without any imports
[info] - should fail if not a case class
[info] PartitionedDelimitedTest:
[info] PartitionedDelimited
[info] - should write out CSVs
[info] XHandlerTest:
[info] Throwable classes
[info] - should be handled if exist in default mapping
[info] - should be handled if exist in custom mapping
[info] - should not be handled if missing in mapping
[info] - should be valid keys in mapping if defined
[info] - should create a URL link in GitHub wiki
[info] BlockJoinPipeTest:
[info] An InnerProductJob
[info] - should correctly compute product with 1 left block and 1 right block
[info] - should correctly compute product with multiple left and right blocks
[info] - should correctly compute product with a valid LeftJoin
[info] - should throw an exception when used with OuterJoin
[info] - should throw an exception when used with an invalid LeftJoin
[info] - should throw an exception when used with an invalid RightJoin
[info] FileSourceTest:
[info] A MultipleTsvFile Source
[info] - should take multiple Tsv files as input sources
[info] A WritableSequenceFile Source
[info] - should sequence file input
[info] - should writable sequence file input
[info] A MultipleTextLineFiles Source
[info] - should take multiple text files as input sources
[info] TextLine.toIterator
[info] - should correctly read strings
[info] default pathIsGood
[info] - should reject a non-existing directory
[info] - should accept a directory with data in it
[info] - should accept a directory with data and _SUCCESS in it
[info] - should accept a single directory without glob
[info] - should reject a single directory glob with ignored files
[info] - should reject a directory with only _SUCCESS when specified as a glob
[info] - should accept a directory with only _SUCCESS when specified without a glob
[info] FileSource.globHasSuccessFile
[info] - should accept a directory glob with only _SUCCESS
[info] - should accept a directory glob with _SUCCESS and other hidden files
[info] - should accept a directory glob with _SUCCESS and other non-hidden files
[info] - should reject a path without glob
[info] - should reject a multi-dir glob without _SUCCESS
[info] success file source pathIsGood
[info] - should reject a non-existing directory
[info] - should reject a directory with data in it but no _SUCCESS file
[info] - should reject a single directory without glob
[info] - should reject a single directory glob with only _SUCCESS and ignored files
[info] - should accept a directory with data and _SUCCESS in it when specified as a glob
[info] - should reject a directory with data and _SUCCESS in it when specified without a glob
[info] - should reject a directory with only _SUCCESS when specified as a glob
[info] - should reject a directory with only _SUCCESS when specified without a glob
[info] - should reject a multi-dir glob with only one _SUCCESS
[info] - should accept a multi-dir glob if every dir has _SUCCESS
[info] - should accept a multi-dir glob if all dirs with non-hidden files have _SUCCESS while dirs with hidden ones don't
[info] - should accept a multi-dir glob if all dirs with non-hidden files have _SUCCESS while other dirs are empty or don't exist
[info] FixedPathSource.hdfsWritePath
[info] - should crib if path == *
[info] - should crib if path == /*
[info] - should remove /* from a path ending in /*
[info] - should leave path as-is when it ends in a directory name
[info] - should leave path as-is when it ends in a directory name/
[info] - should leave path as-is when it ends in * without a preceding /
[info] invalid source input
[info] - should Throw in validateTaps in strict mode
[info] - should Throw in validateTaps in non-strict mode
[info] - should Throw in toIterator because no data is present in strict mode
[info] - should Throw in toIterator because no data is present in non-strict mode
[info] TupleTest:
[info] TupleConverters
[info] - should TupleGetter should work as a type-class
[info] - should get primitives out of cascading tuples
[info] - should get non-primitives out of cascading tuples
[info] - should deal with AnyRef
[info] WrappedJoinerTest:
[info] Methods called from a Joiner
[info] - should have access to a FlowProcess when WrappedJoiner is used
[info] - should have no access to a FlowProcess when WrappedJoiner is not used
[info] NoStackLineNumberTest:
[info] No Stack Shouldn't block getting line number info
[info] - should actually get the no stack info
[info] ReferencedClassFinderTest:
[info] JobClassFinder
[info] - should Identify and tokenize used case classes
[info] - should Run successfully
[info] ScalaTest
[info] Run completed in 3 minutes, 31 seconds.
[info] Total number of tests run: 599
[info] Suites: completed 165, aborted 0
[info] Tests: succeeded 599, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 606, Failed 0, Errors 0, Passed 606
[info] Compiling 93 Scala sources and 93 Java sources to /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/Operation.scala:110: [UnextendedSealedTrait] This sealed trait is never extended
[warn] sealed trait Operation extends ThriftEnum with Serializable
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/NumberEnum.scala:98: [UnextendedSealedTrait] This sealed trait is never extended
[warn] sealed trait NumberEnum extends ThriftEnum with Serializable
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/NumberEnumWithMoreValue.scala:110: [UnextendedSealedTrait] This sealed trait is never extended
[warn] sealed trait NumberEnumWithMoreValue extends ThriftEnum with Serializable
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/ABool.scala:317: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
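(Editor's note: the [UseOptionFlatMapNotPatMatch] warnings above all flag the same pattern in the generated Scrooge code — an explicit `Option` pattern match that maps `Some` to a new `Some` and `None` to `None` — and suggest `.flatMap`. A minimal sketch of the equivalence, using hypothetical values rather than the generated `TFieldBlob` code:)

```scala
// Hypothetical stand-in for the generated code's _fieldOpt.
val fieldOpt: Option[Int] = Some(21)

// The pattern-match form flagged by [UseOptionFlatMapNotPatMatch]:
val viaMatch: Option[Int] = fieldOpt match {
  case Some(x) => Some(x * 2)
  case None    => None
}

// The equivalent flatMap form the warning recommends:
val viaFlatMap: Option[Int] = fieldOpt.flatMap(x => Some(x * 2))

assert(viaMatch == viaFlatMap)
```

Because the warnings come from generated sources under `src_managed`, they are cosmetic and do not affect the build result.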
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/ALong.scala:317: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/AString.scala:324: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/AStructThatLooksLikeUnionV2.scala:483: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/Address.scala:416: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/AddressWithStreetWithDefaultRequirement.scala:416: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/ListNestEnum.scala:350: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/ListNestMap.scala:1356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/ListNestSet.scala:1242: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/MapNestList.scala:1629: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/MapNestMap.scala:1533: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/MapNestSet.scala:1341: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/Name.scala:418: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/NestedList.scala:1440: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/Operation.scala:72: [UseGetOrElseNotPatMatch] ... match { Some(x) => x; None => Operation.this.EnumUnknownOperation.apply(value)} can be replaced with .getOrElse(Operation.this.EnumUnknownOperation.apply(value))
[warn] get(value) match {
[warn] ^
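(Editor's note: the [UseGetOrElseNotPatMatch] warning above flags a match that returns the `Some` value directly and supplies a fallback for `None`; `.getOrElse` expresses the same thing. A minimal sketch with a hypothetical lookup standing in for the generated `Operation.get(value)`:)

```scala
// Hypothetical stand-in for the generated enum lookup.
val value = 99
val known: Map[Int, String] = Map(1 -> "Add", 2 -> "Remove")

// The pattern-match form flagged by [UseGetOrElseNotPatMatch]:
val viaMatch: String = known.get(value) match {
  case Some(x) => x
  case None    => s"EnumUnknownOperation($value)"
}

// The equivalent getOrElse form the warning recommends:
val viaGetOrElse: String = known.get(value).getOrElse(s"EnumUnknownOperation($value)")

assert(viaMatch == viaGetOrElse)
```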
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/Phone.scala:409: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/RequiredListFixture.scala:444: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/RequiredMapFixture.scala:450: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/RequiredPrimitiveFixture.scala:934: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/RequiredSetFixture.scala:433: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/SetNestList.scala:1116: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/SetNestMap.scala:1923: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/SetNestSet.scala:1050: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/StringAndBinary.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/StructWithAStructThatLooksLikeUnionV2.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/StructWithExtraField.scala:411: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/StructWithIndexStartsFrom4.scala:318: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/StructWithReorderedOptionalFields.scala:501: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/StructWithUnionV2.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestFieldOfEnum.scala:408: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestListPrimitive.scala:1070: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestListsInMap.scala:505: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestMapBinary.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestMapComplex.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestMapPrimitiveKey.scala:1112: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestMapPrimitiveValue.scala:1112: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestOptionalMap.scala:1077: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestPerson.scala:578: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestPersonWithAllInformation.scala:1011: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestPersonWithRequiredPhone.scala:674: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestSetPrimitive.scala:993: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/binary/StringAndBinary.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/ABool.scala:317: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/ALong.scala:317: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/AString.scala:324: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/AStructThatLooksLikeUnionV2.scala:483: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/AddRequiredStructV1.scala:423: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/DefaultStructV1.scala:317: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/EmptyStruct.scala:232: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/ListOfUnions.scala:470: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/ListStructV1.scala:350: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/ListStructV2.scala:350: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapAddRequiredStructV1.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapStructV1.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapStructV2.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapValueStructV1.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapValueStructV2.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapWithPrimMapValue.scala:394: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapWithStructMapValue.scala:394: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapWithStructValue.scala:356: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapWithUnionKey.scala:482: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/MapWithUnionValue.scala:482: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/NestedEmptyStruct.scala:406: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/NumberEnum.scala:63: [UseGetOrElseNotPatMatch] ... match { Some(x) => x; None => NumberEnum.this.EnumUnknownNumberEnum.apply(value)} can be replaced with .getOrElse(NumberEnum.this.EnumUnknownNumberEnum.apply(value))
[warn] get(value) match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/NumberEnumWithMoreValue.scala:72: [UseGetOrElseNotPatMatch] ... match { Some(x) => x; None => NumberEnumWithMoreValue.this.EnumUnknownNumberEnumWithMoreValue.apply(value)} can be replaced with .getOrElse(NumberEnumWithMoreValue.this.EnumUnknownNumberEnumWithMoreValue.apply(value))
[warn] get(value) match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/OptionalInsideRequired.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/OptionalStructV1.scala:319: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/RenameStructV1.scala:324: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/RequiredInsideOptional.scala:412: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/SetStructV1.scala:339: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/SetStructV2.scala:339: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructV1.scala:324: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructV2.scala:418: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructV3.scala:506: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructV4WithExtracStructField.scala:588: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithAStructThatLooksLikeUnionV2.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithEnum.scala:319: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithMoreEnum.scala:319: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithNestedUnion.scala:1233: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithOptionalUnionOfStructs.scala:412: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithRequiredUnionOfStructs.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithUnionOfStructs.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithUnionV1.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/StructWithUnionV2.scala:417: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/TypeChangeStructV1.scala:317: [UseOptionFlatMapNotPatMatch] ... match { Some(x) => {
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] }; None => None} can be replaced with .flatMap({
[warn] val _data: Array[Byte] = java.util.Arrays.copyOfRange(_buff.getArray(), 0, _buff.length());
[warn] scala.Some.apply[com.twitter.scrooge.TFieldBlob](com.twitter.scrooge.TFieldBlob.apply(_field, _data))
[warn] })
[warn] _fieldOpt match {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/ABool.scala:61: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: ABool): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/ALong.scala:61: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: ALong): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/AStructThatLooksLikeUnionV2.scala:87: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: AStructThatLooksLikeUnionV2): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/Phone.scala:74: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: Phone): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/StructWithReorderedOptionalFields.scala:87: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: StructWithReorderedOptionalFields): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestListsInMap.scala:74: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: TestListsInMap): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/TestOptionalMap.scala:139: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: TestOptionalMap): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/ABool.scala:61: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: ABool): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/ALong.scala:61: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: ALong): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/AStructThatLooksLikeUnionV2.scala:87: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: AStructThatLooksLikeUnionV2): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/DefaultStructV1.scala:61: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: DefaultStructV1): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/EmptyStruct.scala:48: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: EmptyStruct): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/EmptyStruct.scala:51: [UnusedParameter] Parameter original is not used in method withoutPassthroughFields.
[warn] def withoutPassthroughFields(original: EmptyStruct): EmptyStruct =
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/EmptyStruct.scala:135: [UnusedParameter] Parameter _item is not used in method unapply.
[warn] def unapply(_item: EmptyStruct): Boolean = true
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/OptionalStructV1.scala:61: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: OptionalStructV1): Unit = {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_scala/test/compat/TypeChangeStructV1.scala:61: [UnusedParameter] Parameter _item is not used in method validate.
[warn] def validate(_item: TypeChangeStructV1): Unit = {
[warn] ^
[warn] 104 warnings found
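(Editor's note: the `[UseOptionFlatMapNotPatMatch]` warnings above all flag the same generated pattern in the Scrooge output. As a standalone sketch with hypothetical names, not the generated code itself, the rewrite the lint suggests is:)

```scala
// Illustration of [UseOptionFlatMapNotPatMatch] (names are hypothetical):
// pattern-matching an Option where the None case just returns None
// is equivalent to calling .flatMap on it.
def blobViaMatch(buf: Option[Array[Byte]]): Option[Int] =
  buf match {
    case Some(b) => Some(b.length) // compute and re-wrap in Some
    case None    => None           // None passes through unchanged
  }

def blobViaFlatMap(buf: Option[Array[Byte]]): Option[Int] =
  buf.flatMap(b => Some(b.length)) // the form the lint suggests
```

Both definitions return the same result for every input, which is why the lint treats the match form as noise; since the flagged files live under `src_managed`, the warning points at the Scrooge code generator rather than at hand-written code.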
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_java/test/ListNestEnum.java: Some input files use unchecked or unsafe operations.
[info] /Users/geri/work/scalding/scalding-parquet-scrooge-fixtures/target/scala-2.11/src_managed/test/thrift/com/twitter/scalding/parquet/scrooge/thrift_java/test/ListNestEnum.java: Recompile with -Xlint:unchecked for details.
[info] Compiling 4 Scala sources to /Users/geri/work/scalding/scalding-thrift-macros/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/test/scala/com/twitter/scalding/thrift/macros/PlatformTest.scala:86: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] toScroogeInternalOrderedSerialization[TestUnion]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-thrift-macros/src/test/scala/com/twitter/scalding/thrift/macros/PlatformTest.scala:87: [IdenticalIfElseCondition] This condition has appeared earlier in the if-else chain and will never hold here. (except for side-effecting conditions)
[warn] runCompareTest[TestUnion](toScroogeInternalOrderedSerialization[TestUnion], arbitraryInstanceProvider[TestUnion])
[warn] ^
[warn] two warnings found
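(Editor's note: for context on the two `[IdenticalIfElseCondition]` warnings above, a minimal hypothetical example of what that lint detects, not the PlatformTest code itself:)

```scala
// Sketch of [IdenticalIfElseCondition] (hypothetical example):
// the second branch repeats the first condition, so it can never be taken.
def classify(x: Int): String =
  if (x > 0) "positive"
  else if (x > 0) "unreachable" // identical condition: dead branch
  else "non-positive"
```

The lint warns because, barring side effects in the condition, the `else if` branch is dead code.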
[info] Compiling 4 Scala sources and 2 Java sources to /Users/geri/work/scalding/scalding-parquet/target/scala-2.11/test-classes...
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/TypedParquetTupleTest.scala:22: [YodaConditions] Yoda conditions using you are.
[warn] .sink[SampleClassB](TypedParquet[SampleClassB](Seq("output1"))) {
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/TypedParquetTupleTest.scala:82: [YodaConditions] Yoda conditions using you are.
[warn] val input = TypedParquet[SampleClassC](inputPath, fp)
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/macros/MacroUnitTests.scala:208: [YodaConditions] Yoda conditions using you are.
[warn] val readSupport = Macros.caseClassParquetReadSupport[SampleClassE]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/macros/MacroUnitTests.scala:239: [YodaConditions] Yoda conditions using you are.
[warn] val readSupport = Macros.caseClassParquetReadSupport[SampleClassB]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/macros/MacroUnitTests.scala:258: [YodaConditions] Yoda conditions using you are.
[warn] val readSupport = Macros.caseClassParquetReadSupport[SampleClassF]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/macros/MacroUnitTests.scala:285: [YodaConditions] Yoda conditions using you are.
[warn] val readSupport = Macros.caseClassParquetReadSupport[SampleClassH]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/macros/MacroUnitTests.scala:310: [YodaConditions] Yoda conditions using you are.
[warn] val readSupport = Macros.caseClassParquetReadSupport[SampleClassK]
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/ParquetSourcesTests.scala:37: method withColumns in trait HasColumnProjection is deprecated: Use withColumnProjections, which uses a different glob syntax
[warn] assert(src.withColumns === Set())
[warn] ^
[warn] /Users/geri/work/scalding/scalding-parquet/src/test/scala/com/twitter/scalding/parquet/tuple/macros/MacroUnitTests.scala:30: type MockitoSugar in package mock is deprecated: Please use org.scalatest.mockito.MockitoSugar instead
[warn] class MacroUnitTests extends WordSpec with Matchers with MockitoSugar {
[warn] ^
[warn] 9 warnings found
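(Editor's note: the `[YodaConditions]` lines above are flagged on assignments rather than comparisons, which looks like lint false positives. For reference, a hypothetical example of what that rule normally targets:)

```scala
// What a [YodaConditions] lint typically flags (hypothetical example):
// the constant sits on the left-hand side of the comparison.
val limit = 10
val yodaStyle   = 0 == limit % 2  // constant first: flagged as "Yoda"
val normalStyle = limit % 2 == 0  // conventional ordering, same result
```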
[warn] bootstrap class path not set in conjunction with -source 1.6
[info] /Users/geri/work/scalding/scalding-parquet/src/test/java/com/twitter/scalding/parquet/thrift/TestParquetTBaseScheme.java: Some input files use or override a deprecated API.
[info] /Users/geri/work/scalding/scalding-parquet/src/test/java/com/twitter/scalding/parquet/thrift/TestParquetTBaseScheme.java: Recompile with -Xlint:deprecation for details.
[info] /Users/geri/work/scalding/scalding-parquet/src/test/java/com/twitter/scalding/parquet/thrift/TestParquetTBaseScheme.java: Some input files use unchecked or unsafe operations.
[info] /Users/geri/work/scalding/scalding-parquet/src/test/java/com/twitter/scalding/parquet/thrift/TestParquetTBaseScheme.java: Recompile with -Xlint:unchecked for details.
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
log4j:WARN No appenders could be found for logger (cascading.property.AppProps).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[info] JsonLineTest:
[info] A JsonLine sink
[info] - should not stringify lists or numbers and not escape single quotes
[info] - should only sink requested fields
[info] - should read json line input
[info] - should handle missing fields
[info] - should handle nested fields
[info] - should fail on empty lines by default
[info] - should handle empty lines when `failOnEmptyLines` is set to false
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] JDBCSourceCompileTest:
[info] JDBCSource
[info] - should Pick up correct column definitions for MySQL Driver
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] JdbcMacroUnitTests:
[info] - String field missing annotation
[info] - String field size annotation not in range
[info] - Int field size annotation not in range
[info] - Option field with default
[info] - Unknown field type
[info] - Annotation for size doesn't use a constant
[info] - Nested options should be blocked
[info] - Extra annotation not supported on current field
[info] - Two annotations of the same type
[info] Produces the ColumnDefinition
[info] Produces the ColumnDefinition for nested case class
[info] Produces the DBTypeDescriptor
[info] interoperates with Vertica, which uses different type names
[info] Big Jdbc Test
[info] TupleConverter for Date
[info] ResultSetExtractor validation for nullable columns
[info] ResultSetExtractor when nullable values are not null
[info] ResultSetExtractor when null values
[info] - ResultSetExtractor for DB schema type mismatch
[info] - ResultSetExtractor for DB schema nullable mismatch
[info] - Duplicate nested fields should be blocked
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] ReplTest:
[info] REPL in Local mode
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[info] - should save -- TypedPipe[String]
[info] should snapshot
[info] - should only -- TypedPipe[String]
[info] - should can be mapped and saved -- TypedPipe[String]
[info] - should tuples -- TypedPipe[(String,Int)]
[info] should grouped -- Grouped[String,String] which
[info] - is explicit
[info] - is implicit
[info] should joined -- CoGrouped[String, Long] which
[info] - is explicit
[info] - is implicit
[info] - should support toOption on ValuePipe
[info] - should reset flow
[info] - should run entire flow
[info] should TypedPipe of a TextLine
[info] - should support toIterator
[info] - should support toList
[info] should toIterator should generate a snapshot for TypedPipe with
[info] - should flatMap
[info] - should tuple
[info] REPL in Hadoop mode
[info] - should save -- TypedPipe[String]
[info] should snapshot
[info] - should only -- TypedPipe[String]
[info] - should can be mapped and saved -- TypedPipe[String]
[info] - should tuples -- TypedPipe[(String,Int)]
[info] should grouped -- Grouped[String,String] which
[info] - is explicit
[info] - is implicit
[info] should joined -- CoGrouped[String, Long] which
[info] - is explicit
[info] - is implicit
[info] - should support toOption on ValuePipe
[info] - should reset flow
[info] - should run entire flow
[info] should TypedPipe of a TextLine
[info] - should support toIterator
[info] - should support toList
[info] should toIterator should generate a snapshot for TypedPipe with
[info] - should flatMap
[info] - should tuple
[info] findAllUpPath
[info] - should enumerate matching files
The directory '/var/folders/7t/9l0n105n4_jbrq076r887pthz96c2m/T/scalding-repl2538171603631844330' could not be accessed while looking for 'this_matches'
[info] - should ignore directories with restricted permissions
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] HRavenHistoryServiceTest:
[info] A HRaven history service
log4j:WARN No appenders could be found for logger (com.twitter.hraven.rest.client.HRavenRestClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[info] - should work as HRaven memory history service
[info] - should work as HRaven reducer history service
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] RatioBasedReducerEstimatorTest:
Formatting using clusterid: testClusterID
Aug 14, 2017 4:39:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices as a root resource class
Aug 14, 2017 4:39:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:39:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:39:38 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:39:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
Aug 14, 2017 4:39:39 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:39:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:39:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices as a root resource class
Aug 14, 2017 4:39:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:39:40 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:39:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:40 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:39:41 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:39:41 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:39:41 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:39:41 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:39:41 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:41 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:41 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:41 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:39:42 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:39:42 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:39:42 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:39:42 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:39:42 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:42 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:42 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:42 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:39:43 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:39:43 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:39:43 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:39:43 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:39:43 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:43 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:43 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:39:43 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
[info] Single-step job with ratio-based reducer estimator
2017-08-14 16:39:48,562 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:39:48,568 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:39:48,574 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] property.AppProps (AppProps.java:getAppID(169)) - using app.id: 50A7646A45004324A1FEEA33C71E8159
2017-08-14 16:39:48,635 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:39:48,772 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Version (Version.java:printBanner(78)) - Concurrent, Inc - Cascading 2.6.1
2017-08-14 16:39:48,775 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:39:48,776 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:39:48,776 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:39:48,777 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:39:48,777 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:39:48,777 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:39:48,797 WARN [pool-49-thread-1] reducer_estimation.EmptyHistoryBasedEstimator (Estimator.scala:estimate(53)) - No matching history found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:39:48,800 INFO [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:39:50,262 INFO [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local502434416_0001
2017-08-14 16:39:50,262 INFO [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:39:50,475 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:39:50,535 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:39:50,536 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:39:50,584 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:39:50,585 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)e9093a6f-97ee-47c0-874e-85dfefea2878)[by:[{1}:'key']]
2017-08-14 16:39:50,782 INFO [pool-52-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:39:50,782 INFO [pool-52-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:39:50,799 INFO [pool-52-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)e9093a6f-97ee-47c0-874e-85dfefea2878)[by:[{1}:'key']]
2017-08-14 16:39:50,799 INFO [pool-52-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:39:55,311 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should not set reducers when no history is found
2017-08-14 16:39:55,432 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:39:55,432 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:39:55,438 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:39:55,451 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:39:55,452 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:39:55,452 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:39:55,453 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:39:55,453 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:39:55,453 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:39:55,458 WARN [pool-53-thread-1] reducer_estimation.ErrorHistoryBasedEstimator (Estimator.scala:estimate(61)) - Unable to fetch history in class com.twitter.scalding.reducer_estimation.ErrorHistoryBasedEstimator
java.lang.RuntimeException: Failed to fetch job history
at com.twitter.scalding.reducer_estimation.ErrorHistoryService$.fetchHistory(RatioBasedEstimatorTest.scala:29)
at com.twitter.scalding.estimation.HistoryEstimator$class.estimate(Estimator.scala:51)
at com.twitter.scalding.reducer_estimation.RatioBasedEstimator.estimate(RatioBasedEstimator.scala:18)
at com.twitter.scalding.reducer_estimation.ReducerEstimatorStepStrategy$$anonfun$estimate$1.apply(ReducerEstimatorStepStrategy.scala:78)
at com.twitter.scalding.reducer_estimation.ReducerEstimatorStepStrategy$$anonfun$estimate$1.apply(ReducerEstimatorStepStrategy.scala:67)
at scala.Option.foreach(Option.scala:257)
at com.twitter.scalding.reducer_estimation.ReducerEstimatorStepStrategy$.estimate(ReducerEstimatorStepStrategy.scala:67)
at com.twitter.scalding.reducer_estimation.ReducerEstimatorStepStrategy$.apply(ReducerEstimatorStepStrategy.scala:44)
at cascading.flow.planner.FlowStepJob.applyFlowStepConfStrategy(FlowStepJob.java:187)
at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:148)
at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:39:55,461 INFO [pool-53-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:39:56,622 INFO [pool-53-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1139533503_0002
2017-08-14 16:39:56,622 INFO [pool-53-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:39:56,747 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:39:56,766 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:39:56,767 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:39:56,775 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:39:56,775 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:39:56,838 INFO [pool-56-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:39:56,839 INFO [pool-56-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:39:56,845 INFO [pool-56-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:39:56,845 INFO [pool-56-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:01,649 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should not set reducers when error fetching history
2017-08-14 16:40:01,700 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:40:01,700 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:40:01,706 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:01,716 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:40:01,716 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:01,716 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:01,716 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:40:01,716 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:40:01,716 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:40:01,720 INFO [pool-57-thread-1] reducer_estimation.ValidHistoryBasedEstimator (Estimator.scala:estimate(56)) - 4 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:40:01,729 WARN [pool-57-thread-1] reducer_estimation.ValidHistoryBasedEstimator (RatioBasedEstimator.scala:com$twitter$scalding$reducer_estimation$RatioBasedEstimator$$acceptableInputRatio(31)) - Input sizes differ too much to use for estimation: current: 2496, past: 10
2017-08-14 16:40:01,733 INFO [pool-57-thread-1] reducer_estimation.ValidHistoryBasedEstimator (RatioBasedEstimator.scala:estimate(65)) - Getting base estimate from InputSizeReducerEstimator
2017-08-14 16:40:01,734 INFO [pool-57-thread-1] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(50)) -
InputSizeReducerEstimator
- input size (bytes): 2496
- reducer estimate: 2.4375
- Breakdown:
- Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"] 2496
2017-08-14 16:40:01,736 INFO [pool-57-thread-1] reducer_estimation.ValidHistoryBasedEstimator (RatioBasedEstimator.scala:apply$mcID$sp(71)) -
RatioBasedEstimator
- past reducer ratio: 0.5
- reducer estimate: 2
2017-08-14 16:40:01,736 INFO [pool-57-thread-1] reducer_estimation.ValidHistoryBasedEstimator (Estimator.scala:estimate(58)) - class com.twitter.scalding.reducer_estimation.ValidHistoryBasedEstimator estimate: Some(2)
2017-08-14 16:40:01,737 INFO [pool-57-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:40:02,806 INFO [pool-57-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1832219451_0003
2017-08-14 16:40:02,806 INFO [pool-57-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:40:02,935 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:40:02,958 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:40:02,958 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:40:02,964 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:02,964 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:40:03,016 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:03,016 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:03,020 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:40:03,020 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:03,072 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:03,072 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:03,077 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:40:03,077 INFO [pool-60-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:07,825 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should set reducers correctly when there is valid history
2017-08-14 16:40:07,877 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:40:07,878 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:40:07,884 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:07,895 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:40:07,896 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:07,896 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:07,896 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:40:07,896 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:40:07,896 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:40:07,898 INFO [pool-61-thread-1] reducer_estimation.SmallDataExplosionHistoryBasedEstimator (Estimator.scala:estimate(56)) - 3 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:40:07,898 INFO [pool-61-thread-1] reducer_estimation.SmallDataExplosionHistoryBasedEstimator (RatioBasedEstimator.scala:estimate(65)) - Getting base estimate from InputSizeReducerEstimator
2017-08-14 16:40:07,900 INFO [pool-61-thread-1] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(50)) -
InputSizeReducerEstimator
- input size (bytes): 2496
- reducer estimate: 0.002
- Breakdown:
- Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"] 2496
2017-08-14 16:40:07,900 INFO [pool-61-thread-1] reducer_estimation.SmallDataExplosionHistoryBasedEstimator (RatioBasedEstimator.scala:apply$mcID$sp(71)) -
RatioBasedEstimator
- past reducer ratio: 1000.0
- reducer estimate: 2
2017-08-14 16:40:07,900 INFO [pool-61-thread-1] reducer_estimation.SmallDataExplosionHistoryBasedEstimator (Estimator.scala:estimate(58)) - class com.twitter.scalding.reducer_estimation.SmallDataExplosionHistoryBasedEstimator estimate: Some(2)
2017-08-14 16:40:07,901 INFO [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:40:08,953 INFO [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1372161553_0004
2017-08-14 16:40:08,953 INFO [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:40:09,082 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:40:09,115 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:40:09,116 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:40:09,127 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:09,127 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:40:09,165 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:09,165 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:09,172 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:40:09,172 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:09,206 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:09,206 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:09,211 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)c899b4bc-bb89-4edc-8643-e96ec949850f)[by:[{1}:'key']]
2017-08-14 16:40:09,211 INFO [pool-64-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:13,967 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should handle mapper output explosion over small data correctly
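The two passing estimator runs above go through the same arithmetic: the `InputSizeReducerEstimator` base estimate is the input size divided by a bytes-per-reducer target, and `RatioBasedEstimator` scales that by the past reducer ratio and rounds up. A minimal sketch of that calculation, for checking the logged numbers — the helper names and the `bytesPerReducer` values are back-derived from the log output, not taken from the actual scalding API or job config:

```scala
// Sketch of the reducer estimates seen in the logs above.
// `baseEstimate`/`estimate` are illustrative names, not scalding's API;
// bytesPerReducer values are reverse-engineered from the logged estimates.
object RatioEstimateSketch {
  // Base estimate: input bytes divided by the bytes-per-reducer target.
  def baseEstimate(inputBytes: Long, bytesPerReducer: Long): Double =
    inputBytes.toDouble / bytesPerReducer

  // Final estimate: base estimate scaled by the past reducer ratio,
  // rounded up and clamped to at least 1.
  def estimate(inputBytes: Long, bytesPerReducer: Long, ratio: Double): Int =
    math.max(1, math.ceil(baseEstimate(inputBytes, bytesPerReducer) * ratio).toInt)

  def main(args: Array[String]): Unit = {
    // "valid history" run: base estimate 2.4375, past ratio 0.5 -> 2
    assert(baseEstimate(2496L, 1024L) == 2.4375)
    assert(estimate(2496L, 1024L, 0.5) == 2)
    // "small data explosion" run: base estimate 0.002, past ratio 1000.0 -> 2
    assert(baseEstimate(2496L, 1248000L) == 0.002)
    assert(estimate(2496L, 1248000L, 1000.0) == 2)
  }
}
```

Both runs land on `Some(2)`, matching the `reducer estimate: 2` lines in the log.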
2017-08-14 16:40:14,010 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:40:14,010 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:40:14,015 INFO [pool-1-thread-1-ScalaTest-running-RatioBasedReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:14,029 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:40:14,029 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:14,029 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:14,029 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:40:14,030 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:40:14,030 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:40:14,032 INFO [pool-65-thread-1] reducer_estimation.InvalidHistoryBasedEstimator (Estimator.scala:estimate(56)) - 3 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:40:14,032 WARN [pool-65-thread-1] reducer_estimation.InvalidHistoryBasedEstimator (RatioBasedEstimator.scala:com$twitter$scalding$reducer_estimation$RatioBasedEstimator$$acceptableInputRatio(31)) - Input sizes differ too much to use for estimation: current: 2496, past: 10
2017-08-14 16:40:14,032 WARN [pool-65-thread-1] reducer_estimation.InvalidHistoryBasedEstimator (RatioBasedEstimator.scala:com$twitter$scalding$reducer_estimation$RatioBasedEstimator$$acceptableInputRatio(31)) - Input sizes differ too much to use for estimation: current: 2496, past: 10
2017-08-14 16:40:14,033 WARN [pool-65-thread-1] reducer_estimation.InvalidHistoryBasedEstimator (RatioBasedEstimator.scala:com$twitter$scalding$reducer_estimation$RatioBasedEstimator$$acceptableInputRatio(31)) - Input sizes differ too much to use for estimation: current: 2496, past: 10
2017-08-14 16:40:14,033 WARN [pool-65-thread-1] reducer_estimation.InvalidHistoryBasedEstimator (RatioBasedEstimator.scala:estimate(61)) - No matching history found within input ratio threshold: 0.1
2017-08-14 16:40:14,033 INFO [pool-65-thread-1] reducer_estimation.InvalidHistoryBasedEstimator (Estimator.scala:estimate(58)) - class com.twitter.scalding.reducer_estimation.InvalidHistoryBasedEstimator estimate: None
2017-08-14 16:40:14,033 INFO [pool-65-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:40:15,011 INFO [pool-65-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1097143188_0005
2017-08-14 16:40:15,011 INFO [pool-65-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:40:15,137 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:40:15,155 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:40:15,155 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:40:15,161 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:15,161 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)b6d02049-e29f-4370-b4ee-3b2fa6b263e6)[by:[{1}:'key']]
2017-08-14 16:40:15,206 INFO [pool-68-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:15,206 INFO [pool-68-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:15,211 INFO [pool-68-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)b6d02049-e29f-4370-b4ee-3b2fa6b263e6)[by:[{1}:'key']]
2017-08-14 16:40:15,212 INFO [pool-68-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:20,014 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should not set reducers when there is no valid history
2017-08-14 16:40:20,883 ERROR [ResourceManager Event Processor] resourcemanager.ResourceManager (ResourceManager.java:run(594)) - Returning, interrupted : java.lang.InterruptedException
2017-08-14 16:40:20,884 ERROR [Thread[Thread-333,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
2017-08-14 16:40:20,994 ERROR [Thread[Thread-312,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
[info] RuntimeReducerEstimatorTest:
Formatting using clusterid: testClusterID
Aug 14, 2017 4:40:35 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices as a root resource class
Aug 14, 2017 4:40:35 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:40:35 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:40:35 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:40:35 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:35 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:35 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
Aug 14, 2017 4:40:35 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:40:37 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:40:37 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices as a root resource class
Aug 14, 2017 4:40:37 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:40:37 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:40:37 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:37 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:37 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:37 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:40:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:40:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:40:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:40:38 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:40:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:38 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:38 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:40:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:40:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:40:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:40:39 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:40:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:39 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:39 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:40:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:40:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:40:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:40:40 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:40:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:40 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:40:40 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
[info] Single-step job with runtime-based reducer estimator
2017-08-14 16:40:43,612 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:40:43,613 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:40:43,618 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:43,629 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:40:43,630 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:43,630 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:43,630 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:40:43,630 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:40:43,630 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:40:43,639 INFO [pool-112-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (Estimator.scala:estimate(56)) - 3 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:40:43,644 INFO [pool-112-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (RuntimeReducerEstimator.scala:estimate(153)) -
History items have the following numbers of tasks:
List(3, 3, 3),
and the following numbers of tasks have valid task histories:
List(3, 3, 3)
2017-08-14 16:40:43,650 INFO [pool-112-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (RuntimeReducerEstimator.scala:estimate(187)) -
- HDFS bytes read: List(4992, 1248, 9984)
- Time-to-byte-ratios: List(0.6009615384615384, 0.4807692307692308, 0.7211538461538461)
 - Typical time-to-byte-ratio: Some(0.6009615384615384)
- Desired runtime: 25
- Input bytes: 2496
- Estimate: Some(60)
2017-08-14 16:40:43,651 INFO [pool-112-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (Estimator.scala:estimate(58)) - class com.twitter.scalding.reducer_estimation.RuntimeReducerEstimator$$anon$2 estimate: Some(60)
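The estimator log above reduces to simple arithmetic: pick a typical time-to-byte ratio from the job history, multiply by the input size, and divide by the desired per-reducer runtime. A minimal sketch of that calculation, reproducing Some(60) from the logged inputs (the function name and the median as the "typical" statistic are illustrative assumptions, not the actual Scalding implementation):

```python
# Hypothetical sketch of the reducer estimate seen in the log above.
# Inputs come straight from the RuntimeReducerEstimator log lines.
def estimate_reducers(time_to_byte_ratios, input_bytes, desired_runtime):
    # "Typical" ratio: the median of the history entries (assumption).
    ratios = sorted(time_to_byte_ratios)
    typical = ratios[len(ratios) // 2]
    # Estimated total work = typical ratio * input bytes; split it into
    # chunks of the desired per-reducer runtime.
    return max(1, round(typical * input_bytes / desired_runtime))

estimate = estimate_reducers(
    [0.6009615384615384, 0.4807692307692308, 0.7211538461538461],
    input_bytes=2496,
    desired_runtime=25)
print(estimate)  # 60, matching "Estimate: Some(60)" in the log
```

With the median ratio 0.6009615..., the estimate is 0.6009615 × 2496 / 25 = 60, which is why the local runner then configures 60 reducers in the lines that follow.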
2017-08-14 16:40:43,651 INFO [pool-112-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:40:44,734 INFO [pool-112-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1013442709_0006
2017-08-14 16:40:44,734 INFO [pool-112-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:40:44,865 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:40:44,882 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:40:44,883 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:40:44,887 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:44,887 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:44,965 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:44,965 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:44,968 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:44,968 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:44,996 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:44,996 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,004 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,004 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,044 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,045 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,051 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,051 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,071 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,071 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,075 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,075 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,105 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,105 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,108 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,109 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,132 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,132 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,135 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,135 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,161 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,161 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,164 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,164 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,186 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,187 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,191 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,191 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,218 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,218 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,222 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,222 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,265 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,265 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,269 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,269 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,294 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,294 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,298 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,299 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,335 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,335 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,340 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,340 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,407 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,407 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,411 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,411 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,448 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,448 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,453 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,453 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,488 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,488 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,492 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,492 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,523 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,523 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,532 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,532 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,562 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,562 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,572 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,572 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,607 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,607 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,610 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,610 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,654 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,654 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,658 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,658 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,698 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,698 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,702 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,702 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,736 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,736 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,740 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,741 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,774 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,774 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,779 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,779 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,805 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,805 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,813 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,814 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,848 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,848 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,857 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,858 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,887 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,887 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,892 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,892 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,924 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,924 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,928 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,929 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,957 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,957 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,964 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,965 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:45,988 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:45,988 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:45,991 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:45,992 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,023 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,023 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,027 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,028 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,057 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,057 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,065 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,065 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,096 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,096 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,100 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,100 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,142 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,143 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,147 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,147 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,182 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,183 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,188 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,188 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,211 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,211 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,215 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,215 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,239 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,239 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,241 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,242 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,260 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,261 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,264 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,264 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,289 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,289 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,293 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,293 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,312 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,312 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,315 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,315 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,339 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,340 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,342 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,343 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,355 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,355 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,359 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,359 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,385 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,385 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,391 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,391 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,414 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,414 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,417 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,417 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,447 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,447 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,450 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,450 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,472 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,473 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,477 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,477 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,506 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,506 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,511 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,511 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,534 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,534 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,538 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,538 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,566 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,566 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,569 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,570 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,591 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,591 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,595 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,595 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,611 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,611 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,615 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,616 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,637 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,637 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,640 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,640 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,667 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,667 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,670 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,670 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,692 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,693 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,698 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,698 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,728 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,728 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,732 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,733 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,755 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,755 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,759 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,759 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,786 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,786 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,791 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,791 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,816 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,816 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,819 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,819 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,846 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,846 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,850 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,850 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,871 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,871 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,876 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,876 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,903 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,904 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,907 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,907 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:46,930 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:46,930 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:46,935 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:46,935 INFO [pool-115-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:49,759 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should set reducers correctly with median estimation scheme
2017-08-14 16:40:49,805 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:40:49,806 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:40:49,810 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:49,819 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:40:49,819 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:49,819 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:49,819 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:40:49,819 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:40:49,819 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:40:49,821 INFO [pool-116-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (Estimator.scala:estimate(56)) - 3 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:40:49,822 INFO [pool-116-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (RuntimeReducerEstimator.scala:estimate(153)) -
History items have the following numbers of tasks:
List(3, 3, 3),
and the following numbers of tasks have valid task histories:
List(3, 3, 3)
2017-08-14 16:40:49,823 INFO [pool-116-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (RuntimeReducerEstimator.scala:estimate(187)) -
- HDFS bytes read: List(4992, 1248, 9984)
- Time-to-byte-ratios: List(0.8032852564102564, 0.48878205128205127, 0.5418669871794872)
- Typical type-to-byte-ratio: Some(0.6113114316239316)
- Desired runtime: 25
- Input bytes: 2496
- Estimate: Some(62)
2017-08-14 16:40:49,823 INFO [pool-116-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$2 (Estimator.scala:estimate(58)) - class com.twitter.scalding.reducer_estimation.RuntimeReducerEstimator$$anon$2 estimate: Some(62)
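The estimator log above contains enough numbers to reproduce the reported estimate of 62. A minimal sketch of the apparent arithmetic, inferred from the logged values rather than from scalding's source: it assumes the "typical" ratio is the arithmetic mean of the three per-history time-to-byte ratios, and that the reducer count is the ceiling of `typical_ratio * input_bytes / desired_runtime`.

```python
import math

# Values copied from the RuntimeReducerEstimator log lines above.
time_to_byte_ratios = [0.8032852564102564, 0.48878205128205127, 0.5418669871794872]
desired_runtime = 25   # "Desired runtime: 25"
input_bytes = 2496     # "Input bytes: 2496"

# The logged "typical" ratio matches the arithmetic mean of the three ratios
# (not the median: the median here would be ~0.5419).
typical_ratio = sum(time_to_byte_ratios) / len(time_to_byte_ratios)

# Estimated reducers: total estimated runtime divided by the desired runtime
# per reducer, rounded up (assumption based on the logged output).
estimate = math.ceil(typical_ratio * input_bytes / desired_runtime)

print(typical_ratio)  # ~0.6113, matching "Typical type-to-byte-ratio: Some(0.6113...)"
print(estimate)       # 62, matching "Estimate: Some(62)"
```

This back-of-the-envelope check (0.6113 × 2496 ≈ 1525.8 estimated seconds, divided by a 25-second target per reducer, then rounded up) lands exactly on the `Some(62)` the estimator reports.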
2017-08-14 16:40:49,823 INFO [pool-116-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:40:50,953 INFO [pool-116-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1322722949_0007
2017-08-14 16:40:50,953 INFO [pool-116-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:40:51,091 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:40:51,172 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:40:51,172 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:40:51,177 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:51,177 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:51,286 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:51,286 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:51,290 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:51,290 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:51,316 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:51,316 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:51,320 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:51,320 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
[... ~49 further identical FlowReducer configure blocks (cascading version / child jvm opts / sourcing from / sinking to), 16:40:51,341 through 16:40:52,947, omitted ...]
2017-08-14 16:40:52,986 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:52,987 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:52,994 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:52,997 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,021 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,021 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,024 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,024 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,052 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,052 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,055 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,055 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,074 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,074 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,078 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,078 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,110 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,110 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,114 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,114 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,137 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,137 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,141 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,141 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,173 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,173 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,177 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,178 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,200 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,200 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,202 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,203 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,230 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,230 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,234 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,235 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,256 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,256 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,259 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,259 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:53,279 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:53,279 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:53,284 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)8aa8db1f-fecc-423d-8c1f-63c277f1e085)[by:[{1}:'key']]
2017-08-14 16:40:53,284 INFO [pool-119-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:55,970 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should set reducers correctly with mean estimation scheme
2017-08-14 16:40:56,012 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:40:56,013 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:40:56,017 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:56,031 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:40:56,031 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:56,031 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:56,031 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:40:56,031 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:40:56,031 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:40:56,033 INFO [pool-120-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (Estimator.scala:estimate(56)) - 3 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:40:56,035 INFO [pool-120-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (RuntimeReducerEstimator.scala:estimate(111)) -
History items have the following numbers of tasks:
List(3, 3, 3),
and the following numbers of tasks have valid task histories:
List(3, 3, 3)
2017-08-14 16:40:56,038 INFO [pool-120-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (RuntimeReducerEstimator.scala:estimate(129)) -
- Typical job time: Some(3343.3333333333335)
- Desired runtime: 25
- Estimate: Some(134)
2017-08-14 16:40:56,038 INFO [pool-120-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (Estimator.scala:estimate(58)) - class com.twitter.scalding.reducer_estimation.RuntimeReducerEstimator$$anon$1 estimate: Some(134)
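The estimator lines above are consistent with a simple rule: take the typical (here, mean) total job runtime from the history entries and divide by the desired runtime, rounding up. The sketch below is illustrative only (`estimate_reducers` and the sample history times are not from scalding's source; the log reports only the mean, 3343.33..., not the individual runtimes), but it reproduces the numbers in this log: typical time ~3343.33, desired runtime 25, estimate 134.

```python
import math

def estimate_reducers(history_times_ms, desired_runtime_ms):
    """Hypothetical sketch of runtime-based reducer estimation:
    mean of historical job times, divided by the desired runtime,
    rounded up to a whole number of reducers."""
    typical = sum(history_times_ms) / len(history_times_ms)
    return math.ceil(typical / desired_runtime_ms)

# Three history entries whose mean matches the "Typical job time" in the log.
history = [3343.3333333333335] * 3
print(estimate_reducers(history, 25))  # 134, matching "Estimate: Some(134)"
```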
2017-08-14 16:40:56,038 INFO [pool-120-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:40:57,171 INFO [pool-120-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1540772632_0008
2017-08-14 16:40:57,171 INFO [pool-120-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:40:57,307 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:40:57,323 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:40:57,324 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,327 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:40:57,327 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,440 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,440 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,443 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,443 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,459 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,459 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,462 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,462 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,471 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,471 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,473 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,474 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,491 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,491 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,493 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,493 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,508 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,508 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,510 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,510 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,525 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,526 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,532 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,532 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,552 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,552 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,554 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,554 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,574 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,574 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,577 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,577 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,589 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,589 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,592 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,592 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,607 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,607 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,609 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,610 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,625 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,625 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,629 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,629 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,656 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,656 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,659 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,659 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,692 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,693 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,696 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,696 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,718 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,718 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,720 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,721 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,741 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,741 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,744 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,744 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,773 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,773 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,778 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,778 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,802 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,802 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,806 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,806 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,839 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,839 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,843 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,844 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,861 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,861 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,863 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,864 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,888 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,888 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,892 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,892 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,909 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,910 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,912 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,912 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,922 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,922 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:57,924 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:57,925 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:57,940 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:57,940 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,025 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,045 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,045 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,049 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,049 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,064 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,064 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,066 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,066 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,087 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,087 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,090 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,090 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,103 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,103 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,105 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,105 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,121 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,124 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,127 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,128 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,140 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,140 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,142 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,142 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,155 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,156 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,157 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,157 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,179 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,180 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,184 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,184 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,196 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,196 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,199 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,199 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,217 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,217 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,219 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,220 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,239 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,239 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,242 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,242 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,260 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,260 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,263 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,263 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,282 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,282 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,286 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,286 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,307 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,307 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,309 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,309 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,320 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,320 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,324 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,324 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,343 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,343 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,345 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,346 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,365 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,365 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,367 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,367 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,385 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,385 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,389 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,389 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,415 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,415 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,417 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,417 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,437 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,437 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,440 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,440 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,459 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,459 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,461 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,461 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,477 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,477 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,481 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,481 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,492 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,492 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,495 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,495 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,515 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,515 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,517 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,517 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,528 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,528 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,531 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,531 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,544 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,544 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,546 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,546 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,560 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,560 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,562 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,562 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,577 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,577 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,581 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,581 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,598 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,598 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,600 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,600 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,614 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,614 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,616 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,616 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,636 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,636 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,640 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,640 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,660 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,660 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,663 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,663 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,674 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,674 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,678 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,678 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,694 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,694 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,698 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,698 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,716 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,716 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,719 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,719 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,735 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,735 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,739 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,739 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,758 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,758 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,761 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,761 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,786 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,786 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,790 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,790 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,812 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,812 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:40:59,814 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)875a3123-5255-42f0-b052-27cc681745a4)[by:[{1}:'key']]
2017-08-14 16:40:59,814 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:40:59,841 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:40:59,841 INFO [pool-123-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
[... the four FlowReducer configure lines above (cascading version / child jvm opts / sourcing from / sinking to) repeat with advancing timestamps, 16:40:59,843 through 16:41:00,355, once per remaining reducer task ...]
2017-08-14 16:41:02,201 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should set reducers correctly with mean estimation scheme ignoring input size
2017-08-14 16:41:02,254 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:41:02,254 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:41:02,259 INFO [pool-1-thread-1-ScalaTest-running-RuntimeReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:02,270 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:41:02,270 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:02,270 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:02,270 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:41:02,270 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:41:02,270 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:41:02,272 INFO [pool-124-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (Estimator.scala:estimate(56)) - 3 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:41:02,273 INFO [pool-124-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (RuntimeReducerEstimator.scala:estimate(111)) -
History items have the following numbers of tasks:
List(3, 3, 3),
and the following numbers of tasks have valid task histories:
List(3, 3, 3)
2017-08-14 16:41:02,273 INFO [pool-124-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (RuntimeReducerEstimator.scala:estimate(129)) -
- Typical job time: Some(3000.0)
- Desired runtime: 25
- Estimate: Some(120)
2017-08-14 16:41:02,273 INFO [pool-124-thread-1] reducer_estimation.RuntimeReducerEstimator$$anon$1 (Estimator.scala:estimate(58)) - class com.twitter.scalding.reducer_estimation.RuntimeReducerEstimator$$anon$1 estimate: Some(120)
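The three estimator lines above show the arithmetic behind the reducer count: a typical historical job time of 3000.0 ms divided by the configured desired runtime of 25 yields the estimate of 120 reducers. A minimal sketch of that calculation, assuming the estimator simply divides typical runtime by desired runtime (`estimateReducers` is a hypothetical helper for illustration, not Scalding's actual API):

```scala
// Hypothetical sketch of the runtime-based estimate logged above:
// typical historical job time divided by the desired runtime, rounded up.
def estimateReducers(typicalJobTimeMs: Option[Double], desiredRuntimeMs: Long): Option[Long] =
  typicalJobTimeMs.map(t => math.ceil(t / desiredRuntimeMs).toLong)

// Reproduces the logged values: typical job time 3000.0, desired runtime 25
println(estimateReducers(Some(3000.0), 25)) // Some(120)
```

With no valid history (`None`), the sketch returns no estimate, matching the estimator's `Option` result in the log.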
2017-08-14 16:41:02,274 INFO [pool-124-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:41:04,147 INFO [pool-124-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local797404864_0009
2017-08-14 16:41:04,147 INFO [pool-124-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:41:04,269 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:41:04,283 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:41:04,283 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,286 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:04,286 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,377 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,377 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,379 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,379 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
[... the four FlowReducer configure lines above repeat with advancing timestamps, 16:41:04,390 through 16:41:04,721, once per remaining reducer task; the gist is truncated mid-run ...]
2017-08-14 16:41:04,721 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,723 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,724 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,745 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,745 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,748 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,748 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,761 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,761 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,764 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,764 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,780 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,780 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,784 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,785 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,801 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,801 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,804 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,804 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,816 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,816 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,818 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,818 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,840 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,840 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,842 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,843 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,856 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,856 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,858 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,858 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,872 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,872 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,878 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,878 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,896 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,896 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,898 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,899 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,917 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,917 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,920 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,920 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,941 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,941 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,944 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,944 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,963 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,964 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,967 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,967 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,977 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,977 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:04,980 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:04,981 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:04,998 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:04,999 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,004 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,004 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,019 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,019 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,021 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,021 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,042 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,042 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,044 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,045 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,063 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,063 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,066 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,066 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,084 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,084 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,088 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,088 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,113 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,113 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,116 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,116 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,135 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,135 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,139 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,139 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,153 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,153 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,156 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,156 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,172 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,172 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,175 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,176 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,218 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,218 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,221 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,221 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,242 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,242 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,244 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,244 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,260 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,261 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,263 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,263 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,272 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,272 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,275 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,275 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,289 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,289 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,292 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,293 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,314 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,314 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,317 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,317 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,336 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,336 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,340 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,340 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,354 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,354 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,357 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,358 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,379 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,379 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,383 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,383 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,394 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,394 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,397 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,397 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,415 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,415 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,417 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,417 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,427 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,427 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,430 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,430 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,454 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,454 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,456 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,456 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,477 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,477 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,481 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,481 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,503 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,503 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,505 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,505 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,519 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,519 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,521 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,521 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,543 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,543 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,545 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,546 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,563 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,563 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,565 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,566 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,582 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,582 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,587 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,587 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,602 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,602 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,604 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,604 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,620 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,620 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,623 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,623 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,641 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,642 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,644 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,644 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,654 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,655 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,657 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,657 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,672 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,672 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,676 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,676 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,694 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,694 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,697 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,697 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,713 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,713 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,716 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,717 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,740 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,740 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,743 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,743 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,756 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,757 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,762 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,762 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,784 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,785 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,790 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,790 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,805 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,805 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,807 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,808 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,819 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,819 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,821 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,821 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,844 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,844 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,847 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,848 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,862 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,862 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,865 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,865 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,885 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,885 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,888 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,888 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,905 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,905 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,907 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,907 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,921 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,921 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,924 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,924 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,945 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,945 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,948 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,948 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,961 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,961 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,964 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,964 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:05,978 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:05,978 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:05,984 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:05,984 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,000 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,000 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,002 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,002 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,016 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,016 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,018 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,018 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,038 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,038 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,041 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,041 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,055 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,055 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,057 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,057 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,073 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,073 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,076 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,076 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,101 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,101 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,104 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,104 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,123 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,123 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,127 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,127 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,151 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,151 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,154 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,154 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,171 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,172 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,175 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,175 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,197 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,197 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,200 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,200 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,210 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,210 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,212 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,212 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,232 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,233 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,237 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,237 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,259 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,259 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,262 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,262 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,274 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,274 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,278 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,278 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,298 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,298 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,300 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,300 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,315 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,315 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,317 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,317 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,340 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,340 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,344 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,344 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,360 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,360 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,362 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,362 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,379 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,379 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,383 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,384 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,399 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,399 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,401 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,401 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,415 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,415 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,417 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,417 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,439 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,440 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,443 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,443 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:06,459 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:06,459 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:06,461 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)44c40184-3b95-47c7-9a59-5128cadfd92d)[by:[{1}:'key']]
2017-08-14 16:41:06,462 INFO [pool-127-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:09,168 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should set reducers correctly with median estimation scheme ignoring input size
[info] - should not set reducers when history service is empty
[info] - should not set reducers when history service fails
2017-08-14 16:41:10,009 ERROR [ResourceManager Event Processor] resourcemanager.ResourceManager (ResourceManager.java:run(594)) - Returning, interrupted : java.lang.InterruptedException
2017-08-14 16:41:10,010 ERROR [Thread[Thread-3054,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
2017-08-14 16:41:10,120 ERROR [Thread[Thread-3033,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
[info] ReducerEstimatorTest:
Formatting using clusterid: testClusterID
Aug 14, 2017 4:41:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices as a root resource class
Aug 14, 2017 4:41:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:41:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:41:24 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:41:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
Aug 14, 2017 4:41:24 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:41:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:41:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices as a root resource class
Aug 14, 2017 4:41:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:41:25 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:41:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:26 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:41:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:41:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:41:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:41:26 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:41:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:41:26 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
[info] Single-step job with reducer estimator
2017-08-14 16:41:31,662 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:41:31,662 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:41:31,665 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:31,672 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:41:31,672 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:31,672 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:31,672 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:41:31,673 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:41:31,673 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:41:31,674 INFO [pool-171-thread-1] reducer_estimation.ReducerEstimatorStepStrategy$ (ReducerEstimatorStepStrategy.scala:apply(36)) -
Flow step (1/1) counts.tsv was configured with reducers
set explicitly (scalding.with.reducers.set.explicitly=true) and the estimator
explicit override turned off (scalding.reducer.estimator.override=false). Skipping
reducer estimation.
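The log message above names the two configuration properties that gate reducer estimation. A minimal sketch (Python, for illustration only) of the gating logic the message implies: estimation runs unless reducers were set explicitly and the override flag is off.

```python
def should_estimate(reducers_set_explicitly: bool, estimator_override: bool) -> bool:
    """Gating implied by the log message: skip estimation when reducers were
    set explicitly (scalding.with.reducers.set.explicitly=true) and the
    override flag (scalding.reducer.estimator.override) is false."""
    return (not reducers_set_explicitly) or estimator_override

# The case logged above: reducers set explicitly, override off -> skipped.
assert should_estimate(True, False) is False
```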
2017-08-14 16:41:31,674 INFO [pool-171-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:41:32,650 INFO [pool-171-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local106390858_0010
2017-08-14 16:41:32,650 INFO [pool-171-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:41:32,772 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:41:32,813 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:41:32,813 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:41:32,825 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:32,825 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)7a998ef4-29aa-4e9b-804c-8518c81068bc)[by:[{1}:'key']]
2017-08-14 16:41:32,849 INFO [pool-174-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:32,849 INFO [pool-174-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:32,851 INFO [pool-174-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)7a998ef4-29aa-4e9b-804c-8518c81068bc)[by:[{1}:'key']]
2017-08-14 16:41:32,852 INFO [pool-174-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:37,653 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should run with correct number of reducers
2017-08-14 16:41:37,699 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:41:37,700 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:41:37,702 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt"]
2017-08-14 16:41:37,709 INFO [flow com.twitter.scalding.reducer_estimation.SimpleGlobJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:41:37,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleGlobJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt"]
2017-08-14 16:41:37,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleGlobJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:37,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleGlobJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:41:37,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleGlobJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:41:37,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleGlobJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:41:37,712 INFO [pool-175-thread-1] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(50)) -
InputSizeReducerEstimator
- input size (bytes): 2496
- reducer estimate: 2.4375
- Breakdown:
- Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt"] 2496
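The `InputSizeReducerEstimator` numbers above are consistent with dividing total input bytes by a bytes-per-reducer setting of 1024; that setting is an assumption here (it is not shown in the log), as is the rounding step, which the method name `estimateReducersWithoutRounding` suggests happens later.

```python
input_bytes = 2496          # total input size from the log breakdown above
bytes_per_reducer = 1024    # assumed test configuration; not shown in the log
estimate = input_bytes / bytes_per_reducer
assert estimate == 2.4375   # matches the logged "reducer estimate"

# Rounding up afterwards (an assumption) would yield 3 reducers.
reducers = -(-input_bytes // bytes_per_reducer)
assert reducers == 3
```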
2017-08-14 16:41:37,712 INFO [pool-175-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:41:38,775 INFO [pool-175-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1865855716_0011
2017-08-14 16:41:38,775 INFO [pool-175-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:41:38,914 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:41:38,935 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:41:38,935 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:41:38,939 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt"]
2017-08-14 16:41:38,939 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt)1f53c552-8958-4db3-99be-ac67c19e5f06)[by:[{1}:'key']]
2017-08-14 16:41:38,966 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:38,966 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:38,968 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt)1f53c552-8958-4db3-99be-ac67c19e5f06)[by:[{1}:'key']]
2017-08-14 16:41:38,968 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:38,988 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:38,988 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:38,992 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt)1f53c552-8958-4db3-99be-ac67c19e5f06)[by:[{1}:'key']]
2017-08-14 16:41:38,992 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:39,010 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:39,010 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:39,013 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/*.txt)1f53c552-8958-4db3-99be-ac67c19e5f06)[by:[{1}:'key']]
2017-08-14 16:41:39,013 INFO [pool-178-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:43,777 INFO [flow com.twitter.scalding.reducer_estimation.SimpleGlobJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should run with correct number of reducers when we have a glob pattern in path
2017-08-14 16:41:43,812 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:41:43,813 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:41:43,816 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:43,824 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:41:43,824 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:43,824 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:43,825 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:41:43,825 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:41:43,825 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:41:43,826 INFO [pool-179-thread-1] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(50)) -
InputSizeReducerEstimator
- input size (bytes): 2496
- reducer estimate: 2.4375
- Breakdown:
- Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"] 2496
2017-08-14 16:41:43,826 INFO [pool-179-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:41:44,917 INFO [pool-179-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1099223094_0012
2017-08-14 16:41:44,917 INFO [pool-179-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:41:45,042 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:41:45,057 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:41:45,057 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:41:45,060 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:45,060 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:45,094 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:45,094 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:45,096 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:45,096 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:45,112 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:45,113 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:45,115 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:45,115 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:45,131 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:45,131 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:45,134 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:45,134 INFO [pool-182-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:49,923 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should run with correct number of reducers when overriding set values
2017-08-14 16:41:49,954 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:41:49,955 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:41:49,957 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:49,963 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:41:49,963 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:49,963 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:49,963 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:41:49,963 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:41:49,963 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:41:49,964 INFO [pool-183-thread-1] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(50)) -
InputSizeReducerEstimator
- input size (bytes): 2496
- reducer estimate: 2496.0
- Breakdown:
- Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"] 2496
2017-08-14 16:41:49,964 WARN [pool-183-thread-1] reducer_estimation.ReducerEstimatorStepStrategy$ (ReducerEstimatorStepStrategy.scala:apply$mcII$sp(85)) -
Reducer estimator estimated 2496 reducers, which is more than the configured maximum of 10.
Will use 10 instead.
2017-08-14 16:41:49,965 INFO [pool-183-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:41:50,978 INFO [pool-183-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1056658989_0013
2017-08-14 16:41:50,979 INFO [pool-183-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:41:51,131 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:41:51,148 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:41:51,148 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,153 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:41:51,153 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,194 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,194 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,197 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,197 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,216 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,216 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,219 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,219 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,237 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,238 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,240 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,241 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,257 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,257 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,260 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,260 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,277 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,277 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,279 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,279 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,297 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,298 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,300 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,300 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,319 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,320 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,324 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,324 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,344 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,344 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,347 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,347 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,365 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,365 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,367 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,367 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:51,383 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:41:51,383 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:41:51,387 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)1accd343-14dc-4ac6-82e0-cbb63fba644f)[by:[{1}:'key']]
2017-08-14 16:41:51,388 INFO [pool-186-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:55,985 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should respect cap when estimated reducers is above the configured max
2017-08-14 16:41:56,026 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:41:56,027 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:41:56,043 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:41:56,043 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/04d9d213-63da-4c59-8a65-34370a64bf09"]
2017-08-14 16:41:56,043 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:41:56,043 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:41:56,043 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:41:56,043 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:41:56,044 WARN [pool-187-thread-1] estimation.Common$ (Common.scala:apply(30)) - InputSizeReducerEstimator unable to calculate size: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/04d9d213-63da-4c59-8a65-34370a64bf09"]
2017-08-14 16:41:56,045 WARN [pool-187-thread-1] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(35)) - InputSizeReducerEstimator unable to estimate reducers; cannot compute size of (is it a non hfs tap?):
- MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/04d9d213-63da-4c59-8a65-34370a64bf09"]
2017-08-14 16:41:56,045 INFO [pool-187-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:41:57,113 INFO [pool-187-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_1502743284235_0001
2017-08-14 16:41:57,113 INFO [pool-187-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://MM-MAC-3270:52856/proxy/application_1502743284235_0001/
2017-08-14 16:41:59,425 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_-2111134446_13830 at /127.0.0.1:53134 [Sending block BP-1413808396-172.19.80.83-1502743270145:blk_1073741872_1048]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:52829:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53134 dst: /127.0.0.1:52829
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1413808396-172.19.80.83-1502743270145:blk_1073741872_1048
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:41:59,426 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_-2111134446_13830 at /127.0.0.1:53127 [Sending block BP-1413808396-172.19.80.83-1502743270145:blk_1073741872_1048]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:52829:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53127 dst: /127.0.0.1:52829
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1413808396-172.19.80.83-1502743270145:blk_1073741872_1048
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:41:59,427 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_-2111134446_13830 at /127.0.0.1:53137 [Sending block BP-1413808396-172.19.80.83-1502743270145:blk_1073741872_1048]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:52829:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53137 dst: /127.0.0.1:52829
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1413808396-172.19.80.83-1502743270145:blk_1073741872_1048
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:42:02,169 WARN [pool-187-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] hadoop job job_1502743284235_0001 state at FAILED
2017-08-14 16:42:02,170 WARN [pool-187-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] failure info: Application application_1502743284235_0001 failed 2 times due to AM Container for appattempt_1502743284235_0001_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:42:02,178 WARN [pool-187-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] task completion events identify failed tasks
2017-08-14 16:42:02,179 WARN [pool-187-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] task completion events count: 0
2017-08-14 16:42:02,195 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopping all jobs
2017-08-14 16:42:02,197 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (1/1) counts.tsv
2017-08-14 16:42:02,200 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMemoryJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopped all jobs
[info] - should ignore memory source in input size estimation *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/1) counts.tsv, with job id: job_1502743284235_0001, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:42:02,243 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:42:02,243 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:42:02,251 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:42:02,251 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file.txt"]
2017-08-14 16:42:02,251 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
[info] - should throw FileNotFoundException during estimation
2017-08-14 16:42:02,251 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:42:02,251 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:42:02,251 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:42:02,253 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopping all jobs
2017-08-14 16:42:02,254 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (1/1) counts.tsv
2017-08-14 16:42:02,254 INFO [flow com.twitter.scalding.reducer_estimation.SimpleFileNotFoundJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopped all jobs
[info] Group-all job with reducer estimator
2017-08-14 16:42:02,310 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:42:02,310 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:42:02,312 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:02,325 INFO [flow com.twitter.scalding.reducer_estimation.GroupAllJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:42:02,325 INFO [flow com.twitter.scalding.reducer_estimation.GroupAllJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:02,326 INFO [flow com.twitter.scalding.reducer_estimation.GroupAllJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0]]"]["size.tsv"]
2017-08-14 16:42:02,326 INFO [flow com.twitter.scalding.reducer_estimation.GroupAllJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:42:02,326 INFO [flow com.twitter.scalding.reducer_estimation.GroupAllJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:42:02,326 INFO [flow com.twitter.scalding.reducer_estimation.GroupAllJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:42:02,327 INFO [pool-191-thread-1] reducer_estimation.ReducerEstimatorStepStrategy$ (ReducerEstimatorStepStrategy.scala:apply(36)) -
Flow step (1/1) size.tsv was configured with reducers
set explicitly (scalding.with.reducers.set.explicitly=true) and the estimator
explicit override turned off (scalding.reducer.estimator.override=false). Skipping
reducer estimation.
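The skip message above names the two flags that control this behavior: the step had its reducer count set explicitly (`scalding.with.reducers.set.explicitly=true`) and the override was off (`scalding.reducer.estimator.override=false`). A minimal sketch of how one would let the estimator win anyway — the `-D` invocation style and the `scalding.reducer.estimator.class` key name are assumptions here; only the override key appears in the log itself:

```shell
# Hypothetical invocation: enable an estimator and let it override an
# explicit setNumReduceTasks. Job name and jar are placeholders.
hadoop jar my-scalding-job.jar com.example.MyJob --hdfs \
  -Dscalding.reducer.estimator.class=com.twitter.scalding.reducer_estimation.InputSizeReducerEstimator \
  -Dscalding.reducer.estimator.override=true
```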
2017-08-14 16:42:02,327 INFO [pool-191-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) size.tsv
2017-08-14 16:42:03,262 INFO [pool-191-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local817171344_0014
2017-08-14 16:42:03,262 INFO [pool-191-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:42:03,384 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:42:03,423 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:42:03,423 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:42:03,427 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:03,427 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)418976f9-5ce1-4a4e-9b88-065ba5dbc86d)[by:[{1}:'key']]
2017-08-14 16:42:03,459 INFO [pool-194-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:42:03,459 INFO [pool-194-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:42:03,462 INFO [pool-194-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)418976f9-5ce1-4a4e-9b88-065ba5dbc86d)[by:[{1}:'key']]
2017-08-14 16:42:03,462 INFO [pool-194-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0]]"]["size.tsv"]
2017-08-14 16:42:08,267 INFO [flow com.twitter.scalding.reducer_estimation.GroupAllJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path size.tsv/_temporary
[info] - should run with correct number of reducers (i.e. 1)
[info] Multi-step job with reducer estimator
2017-08-14 16:42:08,362 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:42:08,362 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:42:08,371 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextDelimited[[0:1]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv"]
2017-08-14 16:42:08,372 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:08,445 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:42:08,445 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:08,445 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextDelimited[[0:1]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv"]
2017-08-14 16:42:08,445 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0]]"]["output"]
2017-08-14 16:42:08,445 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:42:08,445 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 3
2017-08-14 16:42:08,445 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 3
2017-08-14 16:42:08,448 INFO [pool-195-thread-1] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(50)) -
InputSizeReducerEstimator
- input size (bytes): 2670
- reducer estimate: 2.607421875
- Breakdown:
- Hfs["TextDelimited[[0:1]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv"] 174
- Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"] 2496
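The estimator log above can be reproduced by simple arithmetic: sum the input sizes and divide by a bytes-per-reducer target. A sketch of that calculation — the 1024-byte divisor is an assumption inferred from the logged numbers (2670 / 1024 = 2.607421875), presumably set artificially low by the test configuration:

```python
import math

# Sketch of the InputSizeReducerEstimator arithmetic as logged above:
# estimate = total input bytes / bytes-per-reducer, then rounded up and
# clamped to at least 1. bytes_per_reducer=1024 is an inferred assumption.
def estimate_reducers(input_sizes, bytes_per_reducer=1024):
    total = sum(size for _, size in input_sizes)  # 174 + 2496 = 2670
    raw = total / bytes_per_reducer
    return raw, max(1, math.ceil(raw))

raw, rounded = estimate_reducers([("scores.tsv", 174), ("hipster.txt", 2496)])
print(raw)      # 2.607421875, matching the logged "reducer estimate"
print(rounded)  # 3, the whole number of reducers after rounding up
```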
2017-08-14 16:42:08,448 INFO [pool-195-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/3)
2017-08-14 16:42:09,330 INFO [pool-195-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local632206655_0015
2017-08-14 16:42:09,330 INFO [pool-195-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:42:09,448 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:42:09,464 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:42:09,464 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:42:09,475 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:09,475 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: CoGroup(_pipe_0-94c38404a39c*_pipe_1-ff2519fcfd94)[by: _pipe_0-94c38404a39c:[{1}:'key0'] _pipe_1-ff2519fcfd94:[{1}:'key1']]
2017-08-14 16:42:09,497 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv
2017-08-14 16:42:09,511 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:42:09,511 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:42:09,514 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextDelimited[[0:1]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv"]
2017-08-14 16:42:09,515 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: CoGroup(_pipe_0-94c38404a39c*_pipe_1-ff2519fcfd94)[by: _pipe_0-94c38404a39c:[{1}:'key0'] _pipe_1-ff2519fcfd94:[{1}:'key1']]
2017-08-14 16:42:09,537 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:42:09,537 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:42:09,541 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: CoGroup(_pipe_0-94c38404a39c*_pipe_1-ff2519fcfd94)[by: _pipe_0-94c38404a39c:[{1}:'key0'] _pipe_1-ff2519fcfd94:[{1}:'key1']]
2017-08-14 16:42:09,541 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: TempHfs["SequenceFile[['key', 'value']]"][9502647427/_pipe_0-94c38404a39c__pip/]
2017-08-14 16:42:09,563 INFO [pool-198-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(105)) - attempting to load codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:42:09,563 INFO [pool-198-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(110)) - found codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:42:09,600 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:42:09,600 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:42:09,604 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: CoGroup(_pipe_0-94c38404a39c*_pipe_1-ff2519fcfd94)[by: _pipe_0-94c38404a39c:[{1}:'key0'] _pipe_1-ff2519fcfd94:[{1}:'key1']]
2017-08-14 16:42:09,604 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: TempHfs["SequenceFile[['key', 'value']]"][9502647427/_pipe_0-94c38404a39c__pip/]
2017-08-14 16:42:09,609 INFO [pool-198-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(105)) - attempting to load codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:42:09,609 INFO [pool-198-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(110)) - found codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:42:09,634 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:42:09,634 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:42:09,638 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: CoGroup(_pipe_0-94c38404a39c*_pipe_1-ff2519fcfd94)[by: _pipe_0-94c38404a39c:[{1}:'key0'] _pipe_1-ff2519fcfd94:[{1}:'key1']]
2017-08-14 16:42:09,639 INFO [pool-198-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: TempHfs["SequenceFile[['key', 'value']]"][9502647427/_pipe_0-94c38404a39c__pip/]
2017-08-14 16:42:09,644 INFO [pool-198-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(105)) - attempting to load codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:42:09,644 INFO [pool-198-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(110)) - found codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:42:14,343 INFO [pool-195-thread-2] reducer_estimation.InputSizeReducerEstimator$ (InputSizeReducerEstimator.scala:estimateReducersWithoutRounding(50)) -
InputSizeReducerEstimator
- input size (bytes): 546
- reducer estimate: 0.533203125
- Breakdown:
- TempHfs["SequenceFile[['key', 'value']]"][9502647427/_pipe_0-94c38404a39c__pip/] 546
2017-08-14 16:42:14,343 INFO [pool-195-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (2/3)
2017-08-14 16:42:15,020 INFO [pool-195-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_1502743284235_0002
2017-08-14 16:42:15,020 INFO [pool-195-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://MM-MAC-3270:52856/proxy/application_1502743284235_0002/
2017-08-14 16:42:20,029 WARN [pool-195-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] hadoop job job_1502743284235_0002 state at FAILED
2017-08-14 16:42:20,030 WARN [pool-195-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] failure info: Application application_1502743284235_0002 failed 2 times due to AM Container for appattempt_1502743284235_0002_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:42:20,030 WARN [pool-195-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] task completion events identify failed tasks
2017-08-14 16:42:20,031 WARN [pool-195-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] task completion events count: 0
2017-08-14 16:42:20,035 WARN [pool-195-thread-3] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] abandoning step: (3/3) output, predecessor failed: (2/3)
2017-08-14 16:42:20,035 INFO [pool-195-thread-3] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (3/3) output
2017-08-14 16:42:20,036 INFO [pool-195-thread-3] reducer_estimation.ReducerEstimatorStepStrategy$ (ReducerEstimatorStepStrategy.scala:apply(36)) -
Flow step (3/3) output was configured with reducers
set explicitly (scalding.with.reducers.set.explicitly=true) and the estimator
explicit override turned off (scalding.reducer.estimator.override=false). Skipping
reducer estimation.
2017-08-14 16:42:20,036 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopping all jobs
2017-08-14 16:42:20,036 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (3/3) output
2017-08-14 16:42:20,037 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (2/3)
2017-08-14 16:42:20,040 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (1/3)
2017-08-14 16:42:20,041 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopped all jobs
[info] - should run with correct number of reducers in each step *** FAILED ***
[info] cascading.flow.FlowException: step failed: (2/3), with job id: job_1502743284235_0002, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] Map-only job with reducer estimator
2017-08-14 16:42:20,074 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:42:20,074 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:42:20,077 INFO [pool-1-thread-1-ScalaTest-running-ReducerEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:20,082 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:42:20,082 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:20,082 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0]]"]["mapped_output"]
2017-08-14 16:42:20,082 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:42:20,082 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:42:20,082 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:42:20,086 INFO [pool-201-thread-1] reducer_estimation.ReducerEstimatorStepStrategy$ (ReducerEstimatorStepStrategy.scala:apply(33)) - com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob is a map-only step. Skipping reducer estimation.
2017-08-14 16:42:20,086 INFO [pool-201-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) mapped_output
2017-08-14 16:42:21,870 INFO [pool-201-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1255010012_0016
2017-08-14 16:42:21,871 INFO [pool-201-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:42:21,985 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:42:21,994 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:42:21,994 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:42:21,996 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:42:21,996 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: Hfs["TextDelimited[[0]]"]["mapped_output"]
2017-08-14 16:42:26,874 INFO [flow com.twitter.scalding.reducer_estimation.SimpleMapOnlyJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path mapped_output/_temporary
[info] - should not set num reducers
2017-08-14 16:43:09,884 ERROR [ResourceManager Event Processor] resourcemanager.ResourceManager (ResourceManager.java:run(594)) - Returning, interrupted : java.lang.InterruptedException
2017-08-14 16:43:09,885 ERROR [Thread[Thread-9756,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
2017-08-14 16:43:09,996 ERROR [Thread[Thread-9735,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
[info] MemoryEstimatorTest:
Formatting using clusterid: testClusterID
Aug 14, 2017 4:43:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices as a root resource class
Aug 14, 2017 4:43:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:43:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:43:23 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:43:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:23 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
Aug 14, 2017 4:43:23 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:43:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:43:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices as a root resource class
Aug 14, 2017 4:43:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:43:24 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:43:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:24 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:24 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:43:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:43:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:43:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:43:25 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:43:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:25 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:25 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:43:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:43:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:43:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:43:26 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:43:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:26 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:26 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:43:27 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:43:27 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:43:27 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:43:27 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:43:27 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:27 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:27 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:43:27 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
[info] Single-step job with memory estimator
2017-08-14 16:43:30,706 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:43:30,706 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:43:30,709 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:30,715 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:43:30,715 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:30,715 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:30,715 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:43:30,715 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:43:30,715 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:43:30,719 WARN [pool-247-thread-1] memory.EmptySmoothedMemoryEstimator (Estimator.scala:estimate(53)) - No matching history found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJob: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:43:30,719 INFO [pool-247-thread-1] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(71)) - Memory estimators didn't calculate any value. Skipping setting memory overrides
2017-08-14 16:43:30,719 INFO [pool-247-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:43:31,669 INFO [pool-247-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local34063145_0017
2017-08-14 16:43:31,669 INFO [pool-247-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:43:31,788 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:43:31,801 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:43:31,801 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:43:31,804 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:31,804 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:31,844 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:31,844 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:31,846 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:31,846 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:31,868 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:31,868 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:31,870 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:31,871 INFO [pool-250-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:36,672 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should without history don't override memory settings
2017-08-14 16:43:36,698 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:43:36,698 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:43:36,700 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:36,709 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:43:36,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:36,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:36,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:43:36,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:43:36,710 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:43:36,713 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (Estimator.scala:estimate(56)) - 8 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJob: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:43:36,715 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(838860800) reduce: None
2017-08-14 16:43:36,716 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(838860800)
2017-08-14 16:43:36,716 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(1073741824) reduce: None
2017-08-14 16:43:36,716 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(1073741824)
2017-08-14 16:43:36,716 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(1363148800) reduce: None
2017-08-14 16:43:36,716 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(1363148800)
2017-08-14 16:43:36,716 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(758120448) reduce: None
2017-08-14 16:43:36,716 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(758120448)
2017-08-14 16:43:36,717 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(838860800, 1073741824, 1363148800, 758120448) smoothAvg = 7.58120448E8, scaled: 9.097445376E8
2017-08-14 16:43:36,718 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(838860800, 1073741824, 1363148800, 758120448) smoothAvg = 7.58120448E8, scaled: 9.097445376E8
2017-08-14 16:43:36,718 INFO [pool-251-thread-1] memory.SmoothedMemoryEstimatorWithData (Estimator.scala:estimate(58)) - class com.twitter.scalding.estimation.memory.SmoothedMemoryEstimatorWithData estimate: Some(MemoryEstimate(Some((1228,1536)),Some((1228,1536))))
2017-08-14 16:43:36,718 INFO [pool-251-thread-1] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(62)) - Overriding map memory to: (1228,1536) in Mb and reduce memory to: (1228,1536) in Mb
2017-08-14 16:43:36,719 INFO [pool-251-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:43:38,510 INFO [pool-251-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1773562438_0018
2017-08-14 16:43:38,510 INFO [pool-251-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:43:38,628 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:43:38,642 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:43:38,643 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:43:38,646 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:38,646 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:38,672 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:38,672 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:38,674 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:38,674 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:38,691 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:38,691 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:38,694 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:38,694 INFO [pool-254-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:43,513 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should run with correct number of memory
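The numbers in the estimator log above fit a simple pattern: `scaled = smoothAvg * 1.2` (7.58120448E8 → 9.097445376E8), the container size is the scaled value grown by ~25% headroom and rounded up to a 512 MB boundary, and xmx is 80% of the container (1228 = 1536 × 0.8). A minimal sketch of that arithmetic, with the constants inferred from the log values rather than taken from the Scalding source:

```python
import math

# Constants inferred from the log output, not from the Scalding source:
XMX_SCALE = 1.2       # scaled = smoothAvg * 1.2  (7.58120448e8 -> 9.097445376e8)
XMX_HEADROOM = 0.8    # xmx is 80% of the container size (1228 = 1536 * 0.8)
ALIGNMENT_MB = 512    # container sizes land on 512 MB multiples

def container_from_smooth_avg(smooth_avg_bytes):
    """Map a smoothed max-committed-heap value (bytes) to (xmx MB, container MB)."""
    scaled_mb = smooth_avg_bytes * XMX_SCALE / (1024 * 1024)
    # Grow by the headroom factor, then round up to the next alignment boundary.
    container_mb = math.ceil(scaled_mb / XMX_HEADROOM / ALIGNMENT_MB) * ALIGNMENT_MB
    xmx_mb = int(container_mb * XMX_HEADROOM)
    return xmx_mb, container_mb

print(container_from_smooth_avg(7.58120448e8))  # -> (1228, 1536), matching the log
```

This reproduces the `MemoryEstimate(Some((1228,1536)),Some((1228,1536)))` line above; it is a reconstruction from the logged values, not the estimator's actual implementation.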
2017-08-14 16:43:43,541 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:43:43,541 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:43:43,545 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:43,550 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:43:43,550 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:43,550 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:43,550 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:43:43,550 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:43:43,550 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] memory.SmoothedMemoryEstimatorWithMoreThanMaxCap (Estimator.scala:estimate(56)) - 2 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJob: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] memory.SmoothedMemoryEstimatorWithMoreThanMaxCap (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(9663676416) reduce: None
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] memory.SmoothedMemoryEstimatorWithMoreThanMaxCap (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(9663676416)
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] memory.SmoothedMemoryEstimatorWithMoreThanMaxCap (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(9663676416) smoothAvg = 9.663676416E9, scaled: 1.1596411699199999E10
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] memory.SmoothedMemoryEstimatorWithMoreThanMaxCap (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(9663676416) smoothAvg = 9.663676416E9, scaled: 1.1596411699199999E10
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] memory.SmoothedMemoryEstimatorWithMoreThanMaxCap (Estimator.scala:estimate(58)) - class com.twitter.scalding.estimation.memory.SmoothedMemoryEstimatorWithMoreThanMaxCap estimate: Some(MemoryEstimate(Some((6553,8192)),Some((6553,8192))))
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(62)) - Overriding map memory to: (6553,8192) in Mb and reduce memory to: (6553,8192) in Mb
2017-08-14 16:43:43,552 INFO [pool-255-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:43:45,314 INFO [pool-255-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local918274119_0019
2017-08-14 16:43:45,314 INFO [pool-255-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:43:45,432 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:43:45,446 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:43:45,446 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:43:45,449 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:45,449 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:45,479 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:45,479 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:45,482 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:45,482 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:45,496 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:45,496 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:45,499 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:45,499 INFO [pool-258-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:50,319 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should respect cap when estimated memory is above the configured max
2017-08-14 16:43:50,344 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:43:50,344 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:43:50,347 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:50,354 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:43:50,354 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:50,354 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:50,354 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:43:50,354 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:43:50,354 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:43:50,356 INFO [pool-259-thread-1] memory.SmoothedMemoryEstimatorWithLessThanMinCap (Estimator.scala:estimate(56)) - 2 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.SimpleJob: HadoopFlowStep[name: (1/1) counts.tsv],Buffer(),HadoopFlowStep[name: (1/1) counts.tsv])
2017-08-14 16:43:50,356 INFO [pool-259-thread-1] memory.SmoothedMemoryEstimatorWithLessThanMinCap (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(549453824) reduce: None
2017-08-14 16:43:50,356 INFO [pool-259-thread-1] memory.SmoothedMemoryEstimatorWithLessThanMinCap (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(549453824)
2017-08-14 16:43:50,357 INFO [pool-259-thread-1] memory.SmoothedMemoryEstimatorWithLessThanMinCap (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(549453824) smoothAvg = 5.49453824E8, scaled: 6.593445888E8
2017-08-14 16:43:50,357 INFO [pool-259-thread-1] memory.SmoothedMemoryEstimatorWithLessThanMinCap (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(549453824) smoothAvg = 5.49453824E8, scaled: 6.593445888E8
2017-08-14 16:43:50,357 INFO [pool-259-thread-1] memory.SmoothedMemoryEstimatorWithLessThanMinCap (Estimator.scala:estimate(58)) - class com.twitter.scalding.estimation.memory.SmoothedMemoryEstimatorWithLessThanMinCap estimate: Some(MemoryEstimate(Some((819,1024)),Some((819,1024))))
2017-08-14 16:43:50,357 INFO [pool-259-thread-1] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(62)) - Overriding map memory to: (819,1024) in Mb and reduce memory to: (819,1024) in Mb
2017-08-14 16:43:50,357 INFO [pool-259-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:43:51,222 INFO [pool-259-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local1551643416_0020
2017-08-14 16:43:51,223 INFO [pool-259-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:43:51,347 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:43:51,392 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:43:51,392 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:43:51,395 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:51,396 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:51,423 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:51,423 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:51,426 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:51,426 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:51,441 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:51,441 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:51,443 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:51,443 INFO [pool-262-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:56,228 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJob] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should respect cap when estimated memory is below the configured min
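The two cap tests above show the clamping step: an 11 GB estimate is cut to the 8192 MB max (xmx 6553 = 8192 × 0.8), and a ~786 MB estimate is raised to the 1024 MB min (xmx 819 = 1024 × 0.8). A sketch of that clamp, with the min/max values assumed from these two test runs:

```python
# Cap values inferred from the two test runs above; the real limits are
# whatever the job configures, these are just the values visible in the log.
MIN_CONTAINER_MB = 1024   # "below the configured min" test
MAX_CONTAINER_MB = 8192   # "above the configured max" test
XMX_HEADROOM = 0.8        # xmx stays at 80% of the (clamped) container

def clamp_container(raw_container_mb):
    """Clamp an estimated container size into [min, max], then derive xmx."""
    container_mb = max(MIN_CONTAINER_MB, min(MAX_CONTAINER_MB, raw_container_mb))
    xmx_mb = int(container_mb * XMX_HEADROOM)
    return xmx_mb, container_mb

print(clamp_container(13824))  # over-estimate  -> (6553, 8192), as in the log
print(clamp_container(786))    # under-estimate -> (819, 1024), as in the log
```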
2017-08-14 16:43:56,257 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:43:56,258 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:43:56,260 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:56,267 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:43:56,267 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:56,267 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:43:56,267 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:43:56,267 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 1
2017-08-14 16:43:56,267 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 1
2017-08-14 16:43:56,268 WARN [pool-263-thread-1] memory.ErrorHistoryBasedMemoryEstimator (Estimator.scala:estimate(61)) - Unable to fetch history in class com.twitter.scalding.estimation.memory.ErrorHistoryBasedMemoryEstimator
java.lang.RuntimeException: Failed to fetch job history
at com.twitter.scalding.reducer_estimation.ErrorHistoryService$.fetchHistory(RatioBasedEstimatorTest.scala:29)
at com.twitter.scalding.estimation.HistoryEstimator$class.estimate(Estimator.scala:51)
at com.twitter.scalding.estimation.memory.ErrorHistoryBasedMemoryEstimator.estimate(MemoryEstimatorTest.scala:225)
at com.twitter.scalding.estimation.memory.MemoryEstimatorStepStrategy$$anonfun$estimate$1.apply(MemoryEstimatorStepStrategy.scala:58)
at com.twitter.scalding.estimation.memory.MemoryEstimatorStepStrategy$$anonfun$estimate$1.apply(MemoryEstimatorStepStrategy.scala:47)
at scala.Option.foreach(Option.scala:257)
at com.twitter.scalding.estimation.memory.MemoryEstimatorStepStrategy$.estimate(MemoryEstimatorStepStrategy.scala:47)
at com.twitter.scalding.estimation.memory.MemoryEstimatorStepStrategy$.apply(MemoryEstimatorStepStrategy.scala:34)
at cascading.flow.planner.FlowStepJob.applyFlowStepConfStrategy(FlowStepJob.java:187)
at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:148)
at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:43:56,268 INFO [pool-263-thread-1] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(71)) - Memory estimators didn't calculate any value. Skipping setting memory overrides
2017-08-14 16:43:56,268 INFO [pool-263-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/1) counts.tsv
2017-08-14 16:43:57,160 INFO [pool-263-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local755916212_0021
2017-08-14 16:43:57,160 INFO [pool-263-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:43:57,276 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:43:57,290 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:43:57,290 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:43:57,293 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:43:57,293 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:57,316 INFO [pool-266-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:43:57,317 INFO [pool-266-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:43:57,319 INFO [pool-266-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: GroupBy(com.twitter.scalding.TextLine(file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt)0db0bf28-8c68-439e-b3cd-0b1c36a5b284)[by:[{1}:'key']]
2017-08-14 16:43:57,319 INFO [pool-266-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: Hfs["TextDelimited[[0:1]]"]["counts.tsv"]
2017-08-14 16:44:02,165 INFO [flow com.twitter.scalding.reducer_estimation.SimpleJobWithNoSetReducers] util.Hadoop18TapUtil (Hadoop18TapUtil.java:cleanTempPath(219)) - deleting temp path counts.tsv/_temporary
[info] - should not set memory when error fetching history
[info] Multi-step job with memory estimator
2017-08-14 16:44:02,202 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:44:02,203 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:44:02,207 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextDelimited[[0:1]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv"]
2017-08-14 16:44:02,208 INFO [pool-1-thread-1-ScalaTest-running-MemoryEstimatorTest] hadoop.Hfs (Hfs.java:makeLocal(507)) - forcing job to local mode, via source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:44:02,224 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting
2017-08-14 16:44:02,224 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextDelimited[[0:1]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv"]
2017-08-14 16:44:02,224 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] source: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:44:02,224 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] sink: Hfs["TextDelimited[[0]]"]["output"]
2017-08-14 16:44:02,224 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] parallel execution is enabled: true
2017-08-14 16:44:02,225 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] starting jobs: 3
2017-08-14 16:44:02,225 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] allocating threads: 3
2017-08-14 16:44:02,227 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (Estimator.scala:estimate(56)) - 8 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.HipJob: HadoopFlowStep[name: (1/3)]HadoopFlowStep[name: (2/3)]HadoopFlowStep[name: (3/3) output],Buffer(),HadoopFlowStep[name: (1/3)])
2017-08-14 16:44:02,227 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(838860800) reduce: None
2017-08-14 16:44:02,227 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(838860800)
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(1073741824) reduce: None
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(1073741824)
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(1363148800) reduce: None
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(1363148800)
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(758120448) reduce: None
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(758120448)
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(838860800, 1073741824, 1363148800, 758120448) smoothAvg = 7.58120448E8, scaled: 9.097445376E8
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(838860800, 1073741824, 1363148800, 758120448) smoothAvg = 7.58120448E8, scaled: 9.097445376E8
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.SmoothedMemoryEstimatorWithData (Estimator.scala:estimate(58)) - class com.twitter.scalding.estimation.memory.SmoothedMemoryEstimatorWithData estimate: Some(MemoryEstimate(Some((1228,1536)),Some((1228,1536))))
2017-08-14 16:44:02,228 INFO [pool-267-thread-1] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(62)) - Overriding map memory to: (1228,1536) in Mb and reduce memory to: (1228,1536) in Mb
2017-08-14 16:44:02,229 INFO [pool-267-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (1/3)
2017-08-14 16:44:03,204 INFO [pool-267-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_local610601813_0022
2017-08-14 16:44:03,204 INFO [pool-267-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://localhost:8080/
2017-08-14 16:44:03,318 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt
2017-08-14 16:44:03,331 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:44:03,332 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:44:03,336 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextLine[['offset', 'line']->[ALL]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/hipster.txt"]
2017-08-14 16:44:03,336 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: CoGroup(_pipe_2-0b1c36a5b284*_pipe_3-52d0eb5a1699)[by: _pipe_2-0b1c36a5b284:[{1}:'key0'] _pipe_3-52d0eb5a1699:[{1}:'key1']]
2017-08-14 16:44:03,610 INFO [LocalJobRunner Map Task Executor #0] io.MultiInputSplit (MultiInputSplit.java:readFields(161)) - current split input path: file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv
2017-08-14 16:44:03,623 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(71)) - cascading version: 2.6.1
2017-08-14 16:44:03,623 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(72)) - child jvm opts: -Xmx512m
2017-08-14 16:44:03,628 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(87)) - sourcing from: Hfs["TextDelimited[[0:1]]"]["file:/Users/geri/work/scalding/scalding-estimators-test/target/scala-2.11/test-classes/scores.tsv"]
2017-08-14 16:44:03,628 INFO [LocalJobRunner Map Task Executor #0] hadoop.FlowMapper (FlowMapper.java:configure(90)) - sinking to: CoGroup(_pipe_2-0b1c36a5b284*_pipe_3-52d0eb5a1699)[by: _pipe_2-0b1c36a5b284:[{1}:'key0'] _pipe_3-52d0eb5a1699:[{1}:'key1']]
2017-08-14 16:44:03,655 INFO [pool-270-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(78)) - cascading version: 2.6.1
2017-08-14 16:44:03,655 INFO [pool-270-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(79)) - child jvm opts: -Xmx512m
2017-08-14 16:44:03,659 INFO [pool-270-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(97)) - sourcing from: CoGroup(_pipe_2-0b1c36a5b284*_pipe_3-52d0eb5a1699)[by: _pipe_2-0b1c36a5b284:[{1}:'key0'] _pipe_3-52d0eb5a1699:[{1}:'key1']]
2017-08-14 16:44:03,659 INFO [pool-270-thread-1] hadoop.FlowReducer (FlowReducer.java:configure(100)) - sinking to: TempHfs["SequenceFile[['key', 'value']]"][1426061066/_pipe_2-0b1c36a5b284__pip/]
2017-08-14 16:44:03,668 INFO [pool-270-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(105)) - attempting to load codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:44:03,669 INFO [pool-270-thread-1] collect.SpillableTupleList (SpillableTupleList.java:getCodecClass(110)) - found codec: org.apache.hadoop.io.compress.GzipCodec
2017-08-14 16:44:08,207 WARN [pool-267-thread-2] memory.SmoothedMemoryEstimatorWithData (Estimator.scala:estimate(53)) - No matching history found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.HipJob: HadoopFlowStep[name: (1/3)]HadoopFlowStep[name: (2/3)]HadoopFlowStep[name: (3/3) output],Buffer(HadoopFlowStep[name: (1/3)]),HadoopFlowStep[name: (2/3)])
2017-08-14 16:44:08,208 INFO [pool-267-thread-2] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(71)) - Memory estimators didn't calculate any value. Skipping setting memory overrides
2017-08-14 16:44:08,208 INFO [pool-267-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] starting step: (2/3)
2017-08-14 16:44:08,852 INFO [pool-267-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] submitted hadoop job: job_1502743403253_0001
2017-08-14 16:44:08,852 INFO [pool-267-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] tracking url: http://MM-MAC-3270:53250/proxy/application_1502743403253_0001/
2017-08-14 16:44:13,858 WARN [pool-267-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] hadoop job job_1502743403253_0001 state at FAILED
2017-08-14 16:44:13,859 WARN [pool-267-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] failure info: Application application_1502743403253_0001 failed 2 times due to AM Container for appattempt_1502743403253_0001_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:44:13,859 WARN [pool-267-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] task completion events identify failed tasks
2017-08-14 16:44:13,859 WARN [pool-267-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] task completion events count: 0
2017-08-14 16:44:13,863 WARN [pool-267-thread-3] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.r...] abandoning step: (3/3) output, predecessor failed: (2/3)
2017-08-14 16:44:13,864 INFO [pool-267-thread-3] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (3/3) output
2017-08-14 16:44:13,867 INFO [pool-267-thread-3] memory.SmoothedMemoryEstimatorWithData (Estimator.scala:estimate(56)) - 2 history entries found for FlowStrategyInfo(com.twitter.scalding.reducer_estimation.HipJob: HadoopFlowStep[name: (1/3)]HadoopFlowStep[name: (2/3)]HadoopFlowStep[name: (3/3) output],Buffer(HadoopFlowStep[name: (2/3)]),HadoopFlowStep[name: (3/3) output])
2017-08-14 16:44:13,867 INFO [pool-267-thread-3] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: Some(536870912) reduce: None
2017-08-14 16:44:13,867 INFO [pool-267-thread-3] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:com$twitter$scalding$estimation$memory$SmoothedHistoryMemoryEstimator$$historyMemory(104)) - Calculated max committed heap for job: null, map: None reduce: Some(536870912)
2017-08-14 16:44:13,867 INFO [pool-267-thread-3] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(536870912) smoothAvg = 5.36870912E8, scaled: 6.442450944E8
2017-08-14 16:44:13,867 INFO [pool-267-thread-3] memory.SmoothedMemoryEstimatorWithData (SmoothedHistoryMemoryEstimator.scala:xmxMemory(58)) - Calculated xmx memory for: List(536870912) smoothAvg = 5.36870912E8, scaled: 6.442450944E8
2017-08-14 16:44:13,867 INFO [pool-267-thread-3] memory.SmoothedMemoryEstimatorWithData (Estimator.scala:estimate(58)) - class com.twitter.scalding.estimation.memory.SmoothedMemoryEstimatorWithData estimate: Some(MemoryEstimate(Some((819,1024)),Some((819,1024))))
2017-08-14 16:44:13,867 INFO [pool-267-thread-3] memory.MemoryEstimatorStepStrategy$ (MemoryEstimatorStepStrategy.scala:apply(62)) - Overriding map memory to: (819,1024) in Mb and reduce memory to: (819,1024) in Mb
2017-08-14 16:44:13,867 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopping all jobs
2017-08-14 16:44:13,869 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (3/3) output
2017-08-14 16:44:13,869 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (2/3)
2017-08-14 16:44:13,872 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.r...] stopping: (1/3)
2017-08-14 16:44:13,874 INFO [flow com.twitter.scalding.reducer_estimation.HipJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.r...] stopped all jobs
[info] - should run with correct number of memory in each step *** FAILED ***
[info] cascading.flow.FlowException: step failed: (2/3), with job id: job_1502743403253_0001, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:44:43,075 ERROR [ResourceManager Event Processor] resourcemanager.ResourceManager (ResourceManager.java:run(594)) - Returning, interrupted : java.lang.InterruptedException
2017-08-14 16:44:43,076 ERROR [Thread[Thread-14717,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
2017-08-14 16:44:43,083 ERROR [Thread[Thread-14696,5,main]] delegation.AbstractDelegationTokenSecretManager (AbstractDelegationTokenSecretManager.java:run(552)) - ExpiredTokenRemover received java.lang.InterruptedException: sleep interrupted
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=384m; support was removed in 8.0
[info] PlatformTest:
Formatting using clusterid: testClusterID
Aug 14, 2017 4:45:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices as a root resource class
Aug 14, 2017 4:45:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:45:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:45:00 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:45:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices to GuiceManagedComponentProvider with the scope "PerRequest"
Aug 14, 2017 4:45:01 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:45:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:45:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices as a root resource class
Aug 14, 2017 4:45:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:45:01 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:45:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.resourcemanager.webapp.RMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:02 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:45:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:45:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:45:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:45:02 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:45:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:02 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:02 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:45:03 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:45:03 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:45:03 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:45:03 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:45:03 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:03 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:03 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:03 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
Aug 14, 2017 4:45:04 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices as a root resource class
Aug 14, 2017 4:45:04 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Aug 14, 2017 4:45:04 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver as a provider class
Aug 14, 2017 4:45:04 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Aug 14, 2017 4:45:04 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:04 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:04 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.server.nodemanager.webapp.NMWebServices to GuiceManagedComponentProvider with the scope "Singleton"
Aug 14, 2017 4:45:04 PM com.google.inject.servlet.GuiceFilter setPipeline
WARNING: Multiple Servlet injectors detected. This is a warning indicating that you have more than one GuiceFilter running in your web application. If this is deliberate, you may safely ignore this message. If this is NOT deliberate however, your application may not work as expected.
[info] An InAndOutTest
2017-08-14 16:45:09,731 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:09,734 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:09,740 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] property.AppProps (AppProps.java:getAppID(169)) - using app.id: 50549A77CA824ECCAADCBF9A1E470319
2017-08-14 16:45:09,891 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] util.Version (Version.java:printBanner(78)) - Concurrent, Inc - Cascading 2.6.1
2017-08-14 16:45:09,894 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:09,895 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/10d3b549-47d8-4e74-8a65-330957f8e6e1"]
2017-08-14 16:45:09,895 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0 | String]]"]["input"]
2017-08-14 16:45:09,896 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:09,896 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 1
2017-08-14 16:45:09,896 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 1
2017-08-14 16:45:09,898 INFO [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (1/1) input
2017-08-14 16:45:11,180 INFO [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0001
2017-08-14 16:45:11,180 INFO [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0001/
2017-08-14 16:45:16,191 WARN [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0001 state at FAILED
2017-08-14 16:45:16,192 WARN [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0001 failed 2 times due to AM Container for appattempt_1502743500244_0001_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:16,198 WARN [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:16,198 WARN [pool-49-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:16,229 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:16,231 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/1) input
2017-08-14 16:45:16,233 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should reading then writing shouldn't change the data *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/1) input, with job id: job_1502743500244_0001, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A TinyJoinAndMergeJob
2017-08-14 16:45:16,351 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:16,351 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:16,373 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:16,374 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/c573bde0-f536-4b0e-84d1-74ed5b08f4a0"]
2017-08-14 16:45:16,374 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0]]"]["input1"]
2017-08-14 16:45:16,375 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:16,375 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 1
2017-08-14 16:45:16,375 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 1
2017-08-14 16:45:16,376 INFO [pool-52-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (1/1) input1
2017-08-14 16:45:17,079 INFO [pool-52-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0002
2017-08-14 16:45:17,079 INFO [pool-52-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0002/
2017-08-14 16:45:22,089 WARN [pool-52-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0002 state at FAILED
2017-08-14 16:45:22,090 WARN [pool-52-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0002 failed 2 times due to AM Container for appattempt_1502743500244_0002_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:22,090 WARN [pool-52-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:22,090 WARN [pool-52-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:22,094 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:22,095 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/1) input1
2017-08-14 16:45:22,097 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should merge and joinWithTiny shouldn't duplicate data *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/1) input1, with job id: job_1502743500244_0002, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A TsvNoCacheJob
2017-08-14 16:45:22,144 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:22,145 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:22,161 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:22,161 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/c573bde0-f536-4b0e-84d1-74ed5b08f4a0"]
2017-08-14 16:45:22,161 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0]]"]["fakeInput"]
2017-08-14 16:45:22,161 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:22,161 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 1
2017-08-14 16:45:22,161 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 1
2017-08-14 16:45:22,162 INFO [pool-55-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (1/1) fakeInput
2017-08-14 16:45:22,868 INFO [pool-55-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0003
2017-08-14 16:45:22,868 INFO [pool-55-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0003/
2017-08-14 16:45:27,878 WARN [pool-55-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0003 state at FAILED
2017-08-14 16:45:27,878 WARN [pool-55-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0003 failed 2 times due to AM Container for appattempt_1502743500244_0003_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:27,879 WARN [pool-55-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:27,879 WARN [pool-55-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:27,883 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:27,884 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/1) fakeInput
2017-08-14 16:45:27,886 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should Writing to a tsv in a flow shouldn't effect the output *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/1) fakeInput, with job id: job_1502743500244_0003, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A multiple group by job
2017-08-14 16:45:27,990 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:27,991 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:28,004 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:28,004 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/c573bde0-f536-4b0e-84d1-74ed5b08f4a0"]
2017-08-14 16:45:28,004 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0 | String]]"]["input"]
2017-08-14 16:45:28,004 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:28,004 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 1
2017-08-14 16:45:28,004 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 1
2017-08-14 16:45:28,005 INFO [pool-58-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (1/1) input
2017-08-14 16:45:28,674 INFO [pool-58-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0004
2017-08-14 16:45:28,674 INFO [pool-58-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0004/
2017-08-14 16:45:33,681 WARN [pool-58-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0004 state at FAILED
2017-08-14 16:45:33,682 WARN [pool-58-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0004 failed 2 times due to AM Container for appattempt_1502743500244_0004_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:33,682 WARN [pool-58-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:33,682 WARN [pool-58-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:33,686 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:33,688 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/1) input
2017-08-14 16:45:33,690 INFO [flow com.twitter.scalding.platform.HadoopPlatformJobTest$$anonfun$1$$anon$1] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should do some ops and not stamp on each other ordered serializations *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/1) input, with job id: job_1502743500244_0004, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A TypedPipeForceToDiskWithDescriptionPipe
2017-08-14 16:45:33,737 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:33,737 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:33,772 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:33,773 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/39bc8663-e85f-4381-ba95-a0b4ce4b2d69"]
2017-08-14 16:45:33,773 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0:1]]"]["output"]
2017-08-14 16:45:33,773 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:33,773 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 2
2017-08-14 16:45:33,773 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 2
2017-08-14 16:45:33,775 INFO [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (1/2)
2017-08-14 16:45:34,455 INFO [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0005
2017-08-14 16:45:34,455 INFO [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0005/
2017-08-14 16:45:39,464 WARN [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0005 state at FAILED
2017-08-14 16:45:39,465 WARN [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0005 failed 2 times due to AM Container for appattempt_1502743500244_0005_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:39,466 WARN [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:39,466 WARN [pool-61-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:39,470 WARN [pool-61-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] abandoning step: (2/2) output, predecessor failed: (1/2)
2017-08-14 16:45:39,470 INFO [pool-61-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (2/2) output
2017-08-14 16:45:39,470 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:39,471 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (2/2) output
2017-08-14 16:45:39,472 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/2)
2017-08-14 16:45:39,474 INFO [flow com.twitter.scalding.platform.TypedPipeForceToDiskWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should have a custom step name from withDescription *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/2), with job id: job_1502743500244_0005, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A limit
2017-08-14 16:45:39,532 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:39,532 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:39,589 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:39,590 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/39bc8663-e85f-4381-ba95-a0b4ce4b2d69"]
2017-08-14 16:45:39,590 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0]]"]["output1"]
2017-08-14 16:45:39,590 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0]]"]["output2"]
2017-08-14 16:45:39,590 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:39,590 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 4
2017-08-14 16:45:39,590 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 4
2017-08-14 16:45:39,591 INFO [pool-64-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (1/4)
2017-08-14 16:45:40,242 INFO [pool-64-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0006
2017-08-14 16:45:40,242 INFO [pool-64-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0006/
2017-08-14 16:45:41,287 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_2072969000_3534 at /127.0.0.1:53812 [Sending block BP-1139679289-172.19.80.83-1502743485203:blk_1073741874_1050]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:53523:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53812 dst: /127.0.0.1:53523
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1139679289-172.19.80.83-1502743485203:blk_1073741874_1050
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:45:42,278 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_1585820955_3638 at /127.0.0.1:53819 [Sending block BP-1139679289-172.19.80.83-1502743485203:blk_1073741874_1050]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:53523:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53819 dst: /127.0.0.1:53523
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1139679289-172.19.80.83-1502743485203:blk_1073741874_1050
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:45:45,249 WARN [pool-64-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0006 state at FAILED
2017-08-14 16:45:45,250 WARN [pool-64-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0006 failed 2 times due to AM Container for appattempt_1502743500244_0006_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:45,251 WARN [pool-64-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:45,251 WARN [pool-64-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:45,255 WARN [pool-64-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] abandoning step: (2/4), predecessor failed: (1/4)
2017-08-14 16:45:45,255 INFO [pool-64-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (2/4)
2017-08-14 16:45:45,256 WARN [pool-64-thread-3] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] abandoning step: (3/4) output2, predecessor failed: (2/4)
2017-08-14 16:45:45,256 WARN [pool-64-thread-4] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] abandoning step: (4/4) output1, predecessor failed: (2/4)
2017-08-14 16:45:45,256 INFO [pool-64-thread-3] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (3/4) output2
2017-08-14 16:45:45,256 INFO [pool-64-thread-4] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (4/4) output1
2017-08-14 16:45:45,256 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:45,257 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (4/4) output1
2017-08-14 16:45:45,258 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (3/4) output2
2017-08-14 16:45:45,259 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (2/4)
2017-08-14 16:45:45,260 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/4)
2017-08-14 16:45:45,263 INFO [flow com.twitter.scalding.platform.GroupedLimitJobWithSteps] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should not fan out into consumers *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/4), with job id: job_1502743500244_0006, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A TypedPipeJoinWithDescriptionPipe
2017-08-14 16:45:45,318 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:45,318 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:45,388 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:45,389 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/4cf2d2eb-0cec-4cb6-967c-c7c6fdc45580"]
2017-08-14 16:45:45,389 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/352e63cd-9e48-41d9-b9a3-5077246c3878"]
2017-08-14 16:45:45,389 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/1ddde117-3b1c-4eb8-b627-d48ff5f74d81"]
2017-08-14 16:45:45,389 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0:1]]"]["output"]
2017-08-14 16:45:45,389 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:45,389 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 1
2017-08-14 16:45:45,389 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 1
2017-08-14 16:45:45,390 INFO [pool-67-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (1/1) output
2017-08-14 16:45:46,003 INFO [pool-67-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0007
2017-08-14 16:45:46,004 INFO [pool-67-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0007/
2017-08-14 16:45:51,011 WARN [pool-67-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0007 state at FAILED
2017-08-14 16:45:51,012 WARN [pool-67-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0007 failed 2 times due to AM Container for appattempt_1502743500244_0007_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:51,013 WARN [pool-67-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:51,013 WARN [pool-67-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:51,017 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:51,019 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/1) output
2017-08-14 16:45:51,021 INFO [flow com.twitter.scalding.platform.TypedPipeJoinWithDescriptionJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should have a custom step name from withDescription and no extra forceToDisk steps on hashJoin's rhs *** FAILED ***
[info] cascading.flow.FlowException: step failed: (1/1) output, with job id: job_1502743500244_0007, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A TypedPipeHashJoinWithForceToDiskJob
2017-08-14 16:45:51,049 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:51,049 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:51,069 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:51,070 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/4cf2d2eb-0cec-4cb6-967c-c7c6fdc45580"]
2017-08-14 16:45:51,070 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/1ddde117-3b1c-4eb8-b627-d48ff5f74d81"]
2017-08-14 16:45:51,070 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0:1]]"]["output"]
2017-08-14 16:45:51,070 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:51,070 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 2
2017-08-14 16:45:51,070 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 2
2017-08-14 16:45:51,071 INFO [pool-70-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (2/2)
2017-08-14 16:45:51,746 INFO [pool-70-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0008
2017-08-14 16:45:51,746 INFO [pool-70-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0008/
2017-08-14 16:45:53,322 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_1461010635_4128 at /127.0.0.1:53870 [Sending block BP-1139679289-172.19.80.83-1502743485203:blk_1073741882_1058]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:53531:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53870 dst: /127.0.0.1:53531
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1139679289-172.19.80.83-1502743485203:blk_1073741882_1058
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:45:53,323 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_1461010635_4128 at /127.0.0.1:53875 [Sending block BP-1139679289-172.19.80.83-1502743485203:blk_1073741882_1058]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:53531:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53875 dst: /127.0.0.1:53531
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1139679289-172.19.80.83-1502743485203:blk_1073741882_1058
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:45:56,751 WARN [pool-70-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0008 state at FAILED
2017-08-14 16:45:56,752 WARN [pool-70-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0008 failed 2 times due to AM Container for appattempt_1502743500244_0008_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:45:56,753 WARN [pool-70-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:45:56,753 WARN [pool-70-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:45:56,758 WARN [pool-70-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] abandoning step: (1/2) output, predecessor failed: (2/2)
2017-08-14 16:45:56,759 INFO [pool-70-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/2) output
2017-08-14 16:45:56,759 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:45:56,760 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/2) output
2017-08-14 16:45:56,760 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (2/2)
2017-08-14 16:45:56,762 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should have a custom step name from withDescription and only one user provided forceToDisk on hashJoin's rhs *** FAILED ***
[info] cascading.flow.FlowException: step failed: (2/2), with job id: job_1502743500244_0008, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A TypedPipeHashJoinWithForceToDiskFilterJob
2017-08-14 16:45:56,794 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:45:56,794 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:45:56,818 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:45:56,818 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/4cf2d2eb-0cec-4cb6-967c-c7c6fdc45580"]
2017-08-14 16:45:56,818 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/1ddde117-3b1c-4eb8-b627-d48ff5f74d81"]
2017-08-14 16:45:56,818 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0:1]]"]["output"]
2017-08-14 16:45:56,818 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:45:56,818 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 3
2017-08-14 16:45:56,818 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 3
2017-08-14 16:45:56,819 INFO [pool-73-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (2/3)
2017-08-14 16:45:57,495 INFO [pool-73-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0009
2017-08-14 16:45:57,495 INFO [pool-73-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0009/
2017-08-14 16:45:59,343 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_-2060667957_4370 at /127.0.0.1:53895 [Sending block BP-1139679289-172.19.80.83-1502743485203:blk_1073741886_1062]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:53531:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53895 dst: /127.0.0.1:53531
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1139679289-172.19.80.83-1502743485203:blk_1073741886_1062
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:45:59,344 ERROR [DataXceiver for client DFSClient_NONMAPREDUCE_-2060667957_4370 at /127.0.0.1:53900 [Sending block BP-1139679289-172.19.80.83-1502743485203:blk_1073741886_1062]] datanode.DataNode (DataXceiver.java:run(252)) - 127.0.0.1:53531:DataXceiver error processing READ_BLOCK operation src: /127.0.0.1:53900 dst: /127.0.0.1:53531
org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Replica not found for BP-1139679289-172.19.80.83-1502743485203:blk_1073741886_1062
at org.apache.hadoop.hdfs.server.datanode.BlockSender.getReplica(BlockSender.java:419)
at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:228)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:495)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:110)
at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:68)
at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:234)
at java.lang.Thread.run(Thread.java:748)
2017-08-14 16:46:02,504 WARN [pool-73-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0009 state at FAILED
2017-08-14 16:46:02,504 WARN [pool-73-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0009 failed 2 times due to AM Container for appattempt_1502743500244_0009_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 127
.Failing this attempt.. Failing the application.
2017-08-14 16:46:02,505 WARN [pool-73-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events identify failed tasks
2017-08-14 16:46:02,505 WARN [pool-73-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] task completion events count: 0
2017-08-14 16:46:02,509 WARN [pool-73-thread-2] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] abandoning step: (3/3), predecessor failed: (2/3)
2017-08-14 16:46:02,510 INFO [pool-73-thread-2] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (3/3)
2017-08-14 16:46:02,510 WARN [pool-73-thread-3] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] abandoning step: (1/3) output, predecessor failed: (3/3)
2017-08-14 16:46:02,510 INFO [pool-73-thread-3] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/3) output
2017-08-14 16:46:02,510 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopping all jobs
2017-08-14 16:46:02,511 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (1/3) output
2017-08-14 16:46:02,512 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (3/3)
2017-08-14 16:46:02,513 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] stopping: (2/3)
2017-08-14 16:46:02,515 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskFilterJob] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] stopped all jobs
[info] - should have a custom step name from withDescription and an extra forceToDisk due to a filter operation on hashJoin's rhs *** FAILED ***
[info] cascading.flow.FlowException: step failed: (2/3), with job id: job_1502743500244_0009, please see cluster logs for failure messages
[info] at cascading.flow.planner.FlowStepJob.blockOnJob(FlowStepJob.java:232)
[info] at cascading.flow.planner.FlowStepJob.start(FlowStepJob.java:150)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:124)
[info] at cascading.flow.planner.FlowStepJob.call(FlowStepJob.java:43)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:748)
[info] A TypedPipeHashJoinWithForceToDiskWithComplete
2017-08-14 16:46:02,546 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] util.HadoopUtil (HadoopUtil.java:findMainClass(336)) - using default application jar, may cause class not found exceptions on the cluster
2017-08-14 16:46:02,547 INFO [pool-1-thread-1-ScalaTest-running-PlatformTest] planner.HadoopPlanner (HadoopPlanner.java:initialize(225)) - using application jar: /Users/geri/.ivy2/cache/cascading/cascading-hadoop/jars/cascading-hadoop-2.6.1.jar
2017-08-14 16:46:02,568 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskWithComplete] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting
2017-08-14 16:46:02,569 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskWithComplete] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/4cf2d2eb-0cec-4cb6-967c-c7c6fdc45580"]
2017-08-14 16:46:02,569 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskWithComplete] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] source: MemorySourceTap["MemorySourceScheme[[0]->[ALL]]"]["/1ddde117-3b1c-4eb8-b627-d48ff5f74d81"]
2017-08-14 16:46:02,569 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskWithComplete] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] sink: Hfs["TextDelimited[[0:1]]"]["output"]
2017-08-14 16:46:02,569 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskWithComplete] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] parallel execution is enabled: true
2017-08-14 16:46:02,569 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskWithComplete] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] starting jobs: 2
2017-08-14 16:46:02,569 INFO [flow com.twitter.scalding.platform.TypedPipeHashJoinWithForceToDiskWithComplete] flow.Flow (BaseFlow.java:logInfo(1378)) - [com.twitter.scalding.p...] allocating threads: 2
2017-08-14 16:46:02,570 INFO [pool-76-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] starting step: (2/2)
2017-08-14 16:46:03,247 INFO [pool-76-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] submitted hadoop job: job_1502743500244_0010
2017-08-14 16:46:03,247 INFO [pool-76-thread-1] flow.FlowStep (BaseFlowStep.java:logInfo(834)) - [com.twitter.scalding.p...] tracking url: http://MM-MAC-3270:53558/proxy/application_1502743500244_0010/
2017-08-14 16:46:08,253 WARN [pool-76-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] hadoop job job_1502743500244_0010 state at FAILED
2017-08-14 16:46:08,253 WARN [pool-76-thread-1] flow.FlowStep (BaseFlowStep.java:logWarn(839)) - [com.twitter.scalding.p...] failure info: Application application_1502743500244_0010 failed 2 times due to AM Container for appattempt_1502743500244_0010_000002 exited with exitCode: 127 due to: Exception from container-launch: ExitCodeException exitCode=127:
ExitCodeException exitCode=127:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
at org.apache.hadoop.util.Shell.run(Shell.java:455)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:300)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:81)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)