@giaosudau
Created August 12, 2016 03:22
2016-08-12T03:17:24,179 INFO [main] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2016-08-12T03:17:24,182 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2016-08-12T03:17:24,210 INFO [main] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.1.3.Final
2016-08-12T03:17:24,664 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='dist/druid/extensions', hadoopDependenciesDir='dist/druid/hadoop-dependencies', hadoopContainerDruidClasspath='null', loadList=[druid-datasketches, druid-avro-extensions, druid-parquet-extensions, postgresql-metadata-storage, druid-hdfs-storage, druid-histogram, druid-datasketches, druid-kafka-indexing-service]}]
2016-08-12T03:17:24,669 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,677 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/druid-datasketches-0.9.1.1.jar]
2016-08-12T03:17:24,677 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/sketches-core-0.2.2.jar]
2016-08-12T03:17:24,677 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-avro-extensions] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,678 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-1.7.7.jar]
2016-08-12T03:17:24,678 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-ipc-1.7.7-tests.jar]
2016-08-12T03:17:24,678 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-ipc-1.7.7.jar]
2016-08-12T03:17:24,678 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-mapred-1.7.7-hadoop2.jar]
2016-08-12T03:17:24,679 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-collections-3.2.1.jar]
2016-08-12T03:17:24,679 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-compress-1.4.1.jar]
2016-08-12T03:17:24,679 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-lang-2.6.jar]
2016-08-12T03:17:24,679 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/druid-avro-extensions-0.9.1.1.jar]
2016-08-12T03:17:24,679 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/gson-2.3.1.jar]
2016-08-12T03:17:24,679 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/guava-16.0.1.jar]
2016-08-12T03:17:24,679 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jackson-core-asl-1.9.13.jar]
2016-08-12T03:17:24,680 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jackson-mapper-asl-1.9.13.jar]
2016-08-12T03:17:24,680 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jersey-client-1.15.jar]
2016-08-12T03:17:24,680 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jersey-core-1.19.jar]
2016-08-12T03:17:24,680 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jetty-6.1.26.jar]
2016-08-12T03:17:24,680 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jetty-util-6.1.26.jar]
2016-08-12T03:17:24,680 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jsr311-api-1.1.1.jar]
2016-08-12T03:17:24,680 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/netty-3.4.0.Final.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/paranamer-2.3.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-api-0.1.3.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-avro-0.1.3.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-client-0.1.3.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-common-0.1.3.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/servlet-api-2.5-20081211.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/slf4j-api-1.6.4.jar]
2016-08-12T03:17:24,681 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/snappy-java-1.0.5.jar]
2016-08-12T03:17:24,682 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/velocity-1.7.jar]
2016-08-12T03:17:24,682 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/xz-1.0.jar]
2016-08-12T03:17:24,686 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-parquet-extensions] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/druid-parquet-extensions-0.9.1.1.jar]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-avro-1.8.1.jar]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-column-1.8.1.jar]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-common-1.8.1.jar]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-encoding-1.8.1.jar]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-format-2.3.1.jar]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-hadoop-1.8.1.jar]
2016-08-12T03:17:24,687 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-jackson-1.8.1.jar]
2016-08-12T03:17:24,688 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-tools-1.8.1.jar]
2016-08-12T03:17:24,690 INFO [main] io.druid.initialization.Initialization - Loading extension [postgresql-metadata-storage] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,690 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/postgresql-metadata-storage/postgresql-9.4.1208.jre7.jar]
2016-08-12T03:17:24,690 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/postgresql-metadata-storage/postgresql-metadata-storage-0.9.1.1.jar]
2016-08-12T03:17:24,691 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-hdfs-storage] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,691 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/avro-1.7.4.jar]
2016-08-12T03:17:24,691 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-beanutils-1.7.0.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-beanutils-core-1.8.0.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-collections-3.2.1.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-compress-1.4.1.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-configuration-1.6.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-digester-1.8.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-math3-3.6.1.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-net-3.1.jar]
2016-08-12T03:17:24,692 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/druid-hdfs-storage-0.9.1.1.jar]
2016-08-12T03:17:24,693 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/guava-16.0.1.jar]
2016-08-12T03:17:24,693 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-auth-2.3.0.jar]
2016-08-12T03:17:24,693 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-client-2.3.0.jar]
2016-08-12T03:17:24,693 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-common-2.3.0.jar]
2016-08-12T03:17:24,693 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-hdfs-2.3.0.jar]
2016-08-12T03:17:24,693 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-app-2.3.0.jar]
2016-08-12T03:17:24,694 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-common-2.3.0.jar]
2016-08-12T03:17:24,694 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-core-2.3.0.jar]
2016-08-12T03:17:24,694 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-jobclient-2.3.0.jar]
2016-08-12T03:17:24,694 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-shuffle-2.3.0.jar]
2016-08-12T03:17:24,694 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-api-2.3.0.jar]
2016-08-12T03:17:24,694 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-client-2.3.0.jar]
2016-08-12T03:17:24,694 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-common-2.3.0.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-server-common-2.3.0.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/jaxb-api-2.2.2.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/paranamer-2.3.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/servlet-api-2.5.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/snappy-java-1.0.4.1.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/stax-api-1.0-2.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/xmlenc-0.52.jar]
2016-08-12T03:17:24,695 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/xz-1.0.jar]
2016-08-12T03:17:24,701 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-histogram] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,702 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-histogram/druid-histogram-0.9.1.1.jar]
2016-08-12T03:17:24,702 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,703 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service] for class [io.druid.cli.CliCommandCreator]
2016-08-12T03:17:24,703 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/druid-kafka-indexing-service-0.9.1.1.jar]
2016-08-12T03:17:24,703 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/kafka-clients-0.9.0.1.jar]
2016-08-12T03:17:24,703 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/lz4-1.3.0.jar]
2016-08-12T03:17:24,703 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/slf4j-api-1.7.6.jar]
2016-08-12T03:17:24,703 INFO [main] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/snappy-java-1.1.1.7.jar]
2016-08-12T03:17:24,851 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,853 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.query.aggregation.datasketches.theta.SketchModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,854 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.query.aggregation.datasketches.theta.oldapi.OldApiSketchModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,854 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-avro-extensions] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,855 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.data.input.avro.AvroExtensionsModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,855 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-parquet-extensions] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,857 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.data.input.parquet.ParquetExtensionsModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,857 INFO [main] io.druid.initialization.Initialization - Loading extension [postgresql-metadata-storage] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,858 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.metadata.storage.postgresql.PostgreSQLMetadataStorageModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,858 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-hdfs-storage] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,859 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.storage.hdfs.HdfsStorageDruidModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,859 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-histogram] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,861 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.query.aggregation.histogram.ApproximateHistogramDruidModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,861 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,861 INFO [main] io.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:24,862 INFO [main] io.druid.initialization.Initialization - Adding local file system extension module [io.druid.indexing.kafka.KafkaIndexTaskModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:25,331 WARN [main] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-08-12T03:17:25,768 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='info'}]
2016-08-12T03:17:25,842 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@11bd803]
2016-08-12T03:17:25,852 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[class com.metamx.metrics.JvmMonitor]}]
2016-08-12T03:17:25,875 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.DruidNode] from props[druid.] as [DruidNode{serviceName='druid/middlemanager', host='192.168.3.30', port=8100}]
2016-08-12T03:17:26,036 INFO [main] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2016-08-12T03:17:26,037 INFO [main] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2016-08-12T03:17:26,049 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='dist/druid/extensions', hadoopDependenciesDir='dist/druid/hadoop-dependencies', hadoopContainerDruidClasspath='null', loadList=[druid-datasketches, druid-avro-extensions, druid-parquet-extensions, postgresql-metadata-storage, druid-hdfs-storage, druid-histogram, druid-datasketches, druid-kafka-indexing-service]}]
2016-08-12T03:17:26,052 INFO [main] io.druid.server.metrics.MetricsModule - Adding monitor[com.metamx.metrics.JvmMonitor@62d73ead]
2016-08-12T03:17:26,053 INFO [main] io.druid.server.metrics.MetricsModule - Adding monitor[io.druid.query.ExecutorServiceMonitor@46731692]
2016-08-12T03:17:26,053 INFO [main] io.druid.server.metrics.MetricsModule - Adding monitor[io.druid.server.initialization.jetty.JettyServerModule$JettyMonitor@3db663d0]
2016-08-12T03:17:26,063 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.log.StartupLoggingConfig] from props[druid.startup.logging.] as [io.druid.server.log.StartupLoggingConfig@2d0ecb24]
2016-08-12T03:17:26,063 INFO [main] io.druid.cli.CliPeon - Starting up with processors[8], memory[247,988,224].
2016-08-12T03:17:26,064 INFO [main] io.druid.cli.CliPeon - * awt.toolkit: sun.lwawt.macosx.LWCToolkit
2016-08-12T03:17:26,064 INFO [main] io.druid.cli.CliPeon - * druid.emitter: logging
2016-08-12T03:17:26,064 INFO [main] io.druid.cli.CliPeon - * druid.emitter.logging.logLevel: info
2016-08-12T03:17:26,064 INFO [main] io.druid.cli.CliPeon - * druid.extensions.directory: dist/druid/extensions
2016-08-12T03:17:26,064 INFO [main] io.druid.cli.CliPeon - * druid.extensions.hadoopDependenciesDir: dist/druid/hadoop-dependencies
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.extensions.loadList: ["druid-datasketches", "druid-avro-extensions", "druid-parquet-extensions", "postgresql-metadata-storage", "druid-hdfs-storage", "druid-histogram", "druid-datasketches", "druid-kafka-indexing-service"]
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.host: 192.168.3.30
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.indexer.logs.directory: var/druid/indexing-logs
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.indexer.logs.type: file
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.indexer.runner.javaOpts: -server -Xmx2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.indexer.task.baseTaskDir: var/druid/task
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.indexer.task.defaultHadoopCoordinates: ["org.apache.hadoop:hadoop-client:2.3.0"]
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.indexer.task.hadoopWorkingPath: var/druid/hadoop-tmp
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.indexer.task.restoreTasksOnRestart: true
2016-08-12T03:17:26,065 INFO [main] io.druid.cli.CliPeon - * druid.metadata.storage.connector.connectURI: jdbc:derby://localhost:1527/var/druid/metadata.db;create=true
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.metadata.storage.connector.host: localhost
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.metadata.storage.connector.port: 1527
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.metadata.storage.type: derby
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.metrics.emitter.dimension.dataSource: no_metrics
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.metrics.emitter.dimension.taskId: index_hadoop_no_metrics_2016-08-12T03:17:23.355Z
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.monitoring.monitors: ["com.metamx.metrics.JvmMonitor"]
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.port: 8100
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.processing.buffer.sizeBytes: 256000000
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.processing.numThreads: 2
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.selectors.coordinator.serviceName: druid/coordinator
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.selectors.indexing.serviceName: druid/overlord
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.server.http.numThreads: 40
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.service: druid/middlemanager
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.startup.logging.logProperties: true
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.storage.storageDirectory: var/druid/segments
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.storage.type: local
2016-08-12T03:17:26,066 INFO [main] io.druid.cli.CliPeon - * druid.worker.capacity: 3
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * druid.zk.paths.base: /druid
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * druid.zk.service.host: localhost
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * file.encoding: UTF-8
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * file.encoding.pkg: sun.io
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * file.separator: /
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * ftp.nonProxyHosts: local|*.local|169.254/16|*.169.254/16
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * gopherProxySet: false
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * hadoop.hadoop.tmp.dir: var/hadoop-tmp
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * http.nonProxyHosts: local|*.local|169.254/16|*.169.254/16
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.awt.graphicsenv: sun.awt.CGraphicsEnvironment
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.awt.printerjob: sun.lwawt.macosx.CPrinterJob
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.class.path: /Users/giaosudau/Downloads/imply-1.3.0/conf-quickstart/druid/_common:/Users/giaosudau/Downloads/imply-1.3.0/conf-quickstart/druid/middleManager:dist/druid/lib/activation-1.1.1.jar:dist/druid/lib/aether-api-0.9.0.M2.jar:dist/druid/lib/aether-connector-file-0.9.0.M2.jar:dist/druid/lib/aether-connector-okhttp-0.0.9.jar:dist/druid/lib/aether-impl-0.9.0.M2.jar:dist/druid/lib/aether-spi-0.9.0.M2.jar:dist/druid/lib/aether-util-0.9.0.M2.jar:dist/druid/lib/airline-0.7.jar:dist/druid/lib/annotations-2.0.3.jar:dist/druid/lib/antlr4-runtime-4.5.1.jar:dist/druid/lib/aopalliance-1.0.jar:dist/druid/lib/aws-java-sdk-1.10.21.jar:dist/druid/lib/aws-java-sdk-autoscaling-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudformation-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudfront-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudhsm-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudsearch-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudtrail-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudwatch-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudwatchmetrics-1.10.21.jar:dist/druid/lib/aws-java-sdk-codecommit-1.10.21.jar:dist/druid/lib/aws-java-sdk-codedeploy-1.10.21.jar:dist/druid/lib/aws-java-sdk-codepipeline-1.10.21.jar:dist/druid/lib/aws-java-sdk-cognitoidentity-1.10.21.jar:dist/druid/lib/aws-java-sdk-cognitosync-1.10.21.jar:dist/druid/lib/aws-java-sdk-config-1.10.21.jar:dist/druid/lib/aws-java-sdk-core-1.10.21.jar:dist/druid/lib/aws-java-sdk-datapipeline-1.10.21.jar:dist/druid/lib/aws-java-sdk-devicefarm-1.10.21.jar:dist/druid/lib/aws-java-sdk-directconnect-1.10.21.jar:dist/druid/lib/aws-java-sdk-directory-1.10.21.jar:dist/druid/lib/aws-java-sdk-dynamodb-1.10.21.jar:dist/druid/lib/aws-java-sdk-ec2-1.10.21.jar:dist/druid/lib/aws-java-sdk-ecs-1.10.21.jar:dist/druid/lib/aws-java-sdk-efs-1.10.21.jar:dist/druid/lib/aws-java-sdk-elasticache-1.10.21.jar:dist/druid/lib/aws-java-sdk-elasticbeanstalk-1.10.21.jar:dist/druid/lib/aws-java-sdk-elasticloadbalancing-1.10.21.jar:dist/druid/lib/aws-java-sdk-elastictranscoder-1.10.21.jar:dist/druid/lib/aws-java-sdk-emr-1.10.21.jar:dist/druid/lib/aws-java-sdk-glacier-1.10.21.jar:dist/druid/lib/aws-java-sdk-iam-1.10.21.jar:dist/druid/lib/aws-java-sdk-importexport-1.10.21.jar:dist/druid/lib/aws-java-sdk-kinesis-1.10.21.jar:dist/druid/lib/aws-java-sdk-kms-1.10.21.jar:dist/druid/lib/aws-java-sdk-lambda-1.10.21.jar:dist/druid/lib/aws-java-sdk-logs-1.10.21.jar:dist/druid/lib/aws-java-sdk-machinelearning-1.10.21.jar:dist/druid/lib/aws-java-sdk-opsworks-1.10.21.jar:dist/druid/lib/aws-java-sdk-rds-1.10.21.jar:dist/druid/lib/aws-java-sdk-redshift-1.10.21.jar:dist/druid/lib/aws-java-sdk-route53-1.10.21.jar:dist/druid/lib/aws-java-sdk-s3-1.10.21.jar:dist/druid/lib/aws-java-sdk-ses-1.10.21.jar:dist/druid/lib/aws-java-sdk-simpledb-1.10.21.jar:dist/druid/lib/aws-java-sdk-simpleworkflow-1.10.21.jar:dist/druid/lib/aws-java-sdk-sns-1.10.21.jar:dist/druid/lib/aws-java-sdk-sqs-1.10.21.jar:dist/druid/lib/aws-java-sdk-ssm-1.10.21.jar:dist/druid/lib/aws-java-sdk-storagegateway-1.10.21.jar:dist/druid/lib/aws-java-sdk-sts-1.10.21.jar:dist/druid/lib/aws-java-sdk-support-1.10.21.jar:dist/druid/lib/aws-java-sdk-swf-libraries-1.10.21.jar:dist/druid/lib/aws-java-sdk-workspaces-1.10.21.jar:dist/druid/lib/base64-2.3.8.jar:dist/druid/lib/bcprov-jdk15on-1.52.jar:dist/druid/lib/bytebuffer-collections-0.2.4.jar:dist/druid/lib/classmate-1.0.0.jar:dist/druid/lib/commons-cli-1.2.jar:dist/druid/lib/commons-codec-1.7.jar:dist/druid/lib/commons-dbcp2-2.0.1.jar:dist/druid/lib/commons-io-2.4.jar:dist/druid/lib/commons-lang-2.6.jar:dist/druid/lib/commons-logging-1.1.1.jar:dist/druid/lib/commons-math3-3.6.1.jar:dist/druid/lib/commons-pool-1.6.jar:dist/druid/lib/commons-pool2-2.2.jar:dist/druid/lib/compress-lzf-1.0.3.jar:dist/druid/lib/config-magic-0.9.jar:dist/druid/lib/curator-client-2.10.0.jar:dist/druid/lib/curator-framework-2.10.0.jar:dist/druid/lib/curator-recipes-2.10.0.jar:dist/druid/lib/curator-x-discovery-2.10.0.jar:dist/druid/lib/derby-10.11.1.1.jar:dist/druid/lib/derbyclient-10.11.1.1.jar:dist/druid/lib/derbynet-10.11.1.1.jar:dist/druid/lib/derbytools-10.11.1.1.jar:dist/druid/lib/disruptor-3.3.0.jar:dist/druid/lib/druid-api-0.9.1.1.jar:dist/druid/lib/druid-aws-common-0.9.1.1.jar:dist/druid/lib/druid-common-0.9.1.1.jar:dist/druid/lib/druid-console-0.0.2.jar:dist/druid/lib/druid-indexing-hadoop-0.9.1.1.jar:dist/druid/lib/druid-indexing-service-0.9.1.1.jar:dist/druid/lib/druid-processing-0.9.1.1.jar:dist/druid/lib/druid-server-0.9.1.1.jar:dist/druid/lib/druid-services-0.9.1.1.jar:dist/druid/lib/emitter-0.3.6.jar:dist/druid/lib/extendedset-1.3.9.jar:dist/druid/lib/geoip2-0.4.0.jar:dist/druid/lib/google-http-client-jackson2-1.15.0-rc.jar:dist/druid/lib/guava-16.0.1.jar:dist/druid/lib/guice-4.0-beta.jar:dist/druid/lib/guice-multibindings-4.0-beta.jar:dist/druid/lib/guice-servlet-4.0-beta.jar:dist/druid/lib/hibernate-validator-5.1.3.Final.jar:dist/druid/lib/http-client-1.0.4.jar:dist/druid/lib/httpclient-4.5.1.jar:dist/druid/lib/httpcore-4.4.3.jar:dist/druid/lib/icu4j-4.8.1.jar:dist/druid/lib/irc-api-1.0-0014.jar:dist/druid/lib/jackson-annotations-2.4.6.jar:dist/druid/lib/jackson-core-2.4.6.jar:dist/druid/lib/jackson-core-asl-1.9.13.jar:dist/druid/lib/jackson-databind-2.4.6.jar:dist/druid/lib/jackson-dataformat-smile-2.4.6.jar:dist/druid/lib/jackson-datatype-guava-2.4.6.jar:dist/druid/lib/jackson-datatype-joda-2.4.6.jar:dist/druid/lib/jackson-jaxrs-base-2.4.6.jar:dist/druid/lib/jackson-jaxrs-json-provider-2.4.6.jar:dist/druid/lib/jackson-jaxrs-smile-provider-2.4.6.jar:dist/druid/lib/jackson-mapper-asl-1.9.13.jar:dist/druid/lib/jackson-module-jaxb-annotations-2.4.6.jar:dist/druid/lib/java-util-0.27.9.jar:dist/druid/lib/java-xmlbuilder-1.1.jar:dist/druid/lib/javax.el-3.0.0.jar:dist/druid/lib/javax.el-api-3.0.0.jar:dist/druid/lib/javax.inject-1.jar:dist/druid/lib/javax.servlet-api-3.1.0.jar:dist/druid/lib/jboss-logging-3.1.3.GA.jar:dist/druid/lib/jcl-over-slf4j-1.7.12
.jar:dist/druid/lib/jdbi-2.63.1.jar:dist/druid/lib/jersey-core-1.19.jar:dist/druid/lib/jersey-guice-1.19.jar:dist/druid/lib/jersey-server-1.19.jar:dist/druid/lib/jersey-servlet-1.19.jar:dist/druid/lib/jets3t-0.9.4.jar:dist/druid/lib/jetty-client-9.2.5.v20141112.jar:dist/druid/lib/jetty-continuation-9.2.5.v20141112.jar:dist/druid/lib/jetty-http-9.2.5.v20141112.jar:dist/druid/lib/jetty-io-9.2.5.v20141112.jar:dist/druid/lib/jetty-proxy-9.2.5.v20141112.jar:dist/druid/lib/jetty-security-9.2.5.v20141112.jar:dist/druid/lib/jetty-server-9.2.5.v20141112.jar:dist/druid/lib/jetty-servlet-9.2.5.v20141112.jar:dist/druid/lib/jetty-servlets-9.2.5.v20141112.jar:dist/druid/lib/jetty-util-9.2.5.v20141112.jar:dist/druid/lib/jline-0.9.94.jar:dist/druid/lib/joda-time-2.8.2.jar:dist/druid/lib/json-path-2.1.0.jar:dist/druid/lib/jsr305-2.0.1.jar:dist/druid/lib/jsr311-api-1.1.1.jar:dist/druid/lib/log4j-1.2-api-2.5.jar:dist/druid/lib/log4j-api-2.5.jar:dist/druid/lib/log4j-core-2.5.jar:dist/druid/lib/log4j-jul-2.5.jar:dist/druid/lib/log4j-slf4j-impl-2.5.jar:dist/druid/lib/lz4-1.3.0.jar:dist/druid/lib/mapdb-1.0.8.jar:dist/druid/lib/maven-aether-provider-3.1.1.jar:dist/druid/lib/maven-model-3.1.1.jar:dist/druid/lib/maven-model-builder-3.1.1.jar:dist/druid/lib/maven-repository-metadata-3.1.1.jar:dist/druid/lib/maven-settings-3.1.1.jar:dist/druid/lib/maven-settings-builder-3.1.1.jar:dist/druid/lib/maxminddb-0.2.0.jar:dist/druid/lib/netty-3.10.4.Final.jar:dist/druid/lib/okhttp-1.0.2.jar:dist/druid/lib/opencsv-2.3.jar:dist/druid/lib/plexus-interpolation-1.19.jar:dist/druid/lib/plexus-utils-3.0.15.jar:dist/druid/lib/protobuf-java-2.5.0.jar:dist/druid/lib/rhino-1.7R5.jar:dist/druid/lib/RoaringBitmap-0.5.16.jar:dist/druid/lib/server-metrics-0.2.8.jar:dist/druid/lib/slf4j-api-1.7.12.jar:dist/druid/lib/spymemcached-2.11.7.jar:dist/druid/lib/tesla-aether-0.0.5.jar:dist/druid/lib/validation-api-1.1.0.Final.jar:dist/druid/lib/wagon-provider-api-2.4.jar:dist/druid/lib/zookeeper-3.4.8.jar
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.class.version: 52.0
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.endorsed.dirs: /Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/endorsed
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.ext.dirs: /Users/giaosudau/Library/Java/Extensions:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/ext:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.home: /Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre
2016-08-12T03:17:26,067 INFO [main] io.druid.cli.CliPeon - * java.io.tmpdir: var/tmp
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.library.path: /Users/giaosudau/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.runtime.name: Java(TM) SE Runtime Environment
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.runtime.version: 1.8.0_77-b03
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.specification.name: Java Platform API Specification
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.specification.vendor: Oracle Corporation
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.specification.version: 1.8
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.util.logging.manager: org.apache.logging.log4j.jul.LogManager
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.vendor: Oracle Corporation
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.vendor.url: http://java.oracle.com/
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.vendor.url.bug: http://bugreport.sun.com/bugreport/
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.version: 1.8.0_77
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.vm.info: mixed mode
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.vm.name: Java HotSpot(TM) 64-Bit Server VM
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.vm.specification.name: Java Virtual Machine Specification
2016-08-12T03:17:26,068 INFO [main] io.druid.cli.CliPeon - * java.vm.specification.vendor: Oracle Corporation
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * java.vm.specification.version: 1.8
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * java.vm.vendor: Oracle Corporation
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * java.vm.version: 25.77-b03
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * line.separator:
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * log4j.shutdownCallbackRegistry: io.druid.common.config.Log4jShutdown
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * log4j.shutdownHookEnabled: true
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * os.arch: x86_64
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * os.name: Mac OS X
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * os.version: 10.11.6
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * path.separator: :
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * socksNonProxyHosts: local|*.local|169.254/16|*.169.254/16
2016-08-12T03:17:26,069 INFO [main] io.druid.cli.CliPeon - * sun.arch.data.model: 64
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.boot.class.path: /Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/sunrsasign.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/classes
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.boot.library.path: /Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre/lib
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.cpu.endian: little
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.cpu.isalist:
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.io.unicode.encoding: UnicodeBig
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.java.command: io.druid.cli.Main internal peon var/druid/task/index_hadoop_no_metrics_2016-08-12T03:17:23.355Z/task.json var/druid/task/index_hadoop_no_metrics_2016-08-12T03:17:23.355Z/15578f93-672c-4852-a0d3-395574643e21/status.json
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.java.launcher: SUN_STANDARD
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.jnu.encoding: UTF-8
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.management.compiler: HotSpot 64-Bit Tiered Compilers
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * sun.os.patch.level: unknown
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * user.country: US
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * user.country.format: VN
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * user.dir: /Users/giaosudau/Downloads/imply-1.3.0
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * user.home: /Users/giaosudau
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * user.language: en
2016-08-12T03:17:26,070 INFO [main] io.druid.cli.CliPeon - * user.name: giaosudau
2016-08-12T03:17:26,071 INFO [main] io.druid.cli.CliPeon - * user.timezone: UTC
2016-08-12T03:17:26,080 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.config.TaskConfig] from props[druid.indexer.task.] as [io.druid.indexing.common.config.TaskConfig@153cd6bb]
2016-08-12T03:17:26,083 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.http.DruidHttpClientConfig] from props[druid.global.http.] as [io.druid.guice.http.DruidHttpClientConfig@4492eede]
2016-08-12T03:17:26,169 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.indexing.IndexingServiceSelectorConfig] from props[druid.selectors.indexing.] as [io.druid.client.indexing.IndexingServiceSelectorConfig@6438a7fe]
2016-08-12T03:17:26,172 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.curator.CuratorConfig] from props[druid.zk.service.] as [io.druid.curator.CuratorConfig@1640190a]
2016-08-12T03:17:26,176 WARN [main] org.apache.curator.retry.ExponentialBackoffRetry - maxRetries too large (30). Pinning to 29
2016-08-12T03:17:26,196 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.CuratorDiscoveryConfig] from props[druid.discovery.curator.] as [io.druid.server.initialization.CuratorDiscoveryConfig@49ede9c7]
2016-08-12T03:17:26,331 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.indexing.common.RetryPolicyConfig] from props[druid.peon.taskActionClient.retry.] as [io.druid.indexing.common.RetryPolicyConfig@b887730]
2016-08-12T03:17:26,333 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.segment.loading.LocalDataSegmentPusherConfig] from props[druid.storage.] as [io.druid.segment.loading.LocalDataSegmentPusherConfig@17f3eefb]
2016-08-12T03:17:26,334 INFO [main] io.druid.segment.loading.LocalDataSegmentPusher - Configured local filesystem as deep storage
2016-08-12T03:17:26,339 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@2e645fbd]
2016-08-12T03:17:26,342 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@625dfff3]
2016-08-12T03:17:26,345 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ZkPathsConfig] from props[druid.zk.paths.] as [io.druid.server.initialization.ZkPathsConfig@22e2266d]
2016-08-12T03:17:26,350 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.coordination.DataSegmentAnnouncerProvider] from props[druid.announcer.] as [io.druid.server.coordination.BatchDataSegmentAnnouncerProvider@298f0a0b]
2016-08-12T03:17:26,352 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.coordinator.CoordinatorSelectorConfig] from props[druid.selectors.coordinator.] as [io.druid.client.coordinator.CoordinatorSelectorConfig@1947596f]
2016-08-12T03:17:26,354 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.segment.realtime.plumber.CoordinatorBasedSegmentHandoffNotifierConfig] from props[druid.segment.handoff.] as [io.druid.segment.realtime.plumber.CoordinatorBasedSegmentHandoffNotifierConfig@47be0f9b]
2016-08-12T03:17:26,358 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [256000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2016-08-12T03:17:26,359 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2016-08-12T03:17:26,359 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.computation.buffer.poolCacheMaxCount, ${base_path}.buffer.poolCacheMaxCount] on [io.druid.query.DruidProcessingConfig#poolCacheMaxCount()]
2016-08-12T03:17:26,360 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.fifo] on [io.druid.query.DruidProcessingConfig#isFifo()]
2016-08-12T03:17:26,360 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2016-08-12T03:17:26,360 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2016-08-12T03:17:26,433 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@6fc6deb7]
2016-08-12T03:17:26,435 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.metadata.SegmentMetadataQueryConfig] from props[druid.query.segmentMetadata.] as [io.druid.query.metadata.SegmentMetadataQueryConfig@34b9eb03]
2016-08-12T03:17:26,437 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@661d88a]
2016-08-12T03:17:26,451 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@7b6c6e70]
2016-08-12T03:17:26,456 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.client.cache.CacheProvider] from props[druid.cache.] as [io.druid.client.cache.LocalCacheProvider@3ac3f6f]
2016-08-12T03:17:26,462 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.cache.CacheConfig] from props[druid.realtime.cache.] as [io.druid.client.cache.CacheConfig@5f2de715]
2016-08-12T03:17:26,476 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ServerConfig] from props[druid.server.http.] as [ServerConfig{numThreads=40, maxIdleTime=PT5M}]
2016-08-12T03:17:26,477 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[interface io.druid.server.log.RequestLoggerProvider] from props[druid.request.logging.] as [io.druid.server.log.NoopRequestLoggerProvider@36c281ed]
2016-08-12T03:17:26,479 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.security.AuthConfig] from props[druid.auth.] as [AuthConfig{enabled=false}]
2016-08-12T03:17:26,490 INFO [main] org.eclipse.jetty.util.log - Logging initialized @3082ms
2016-08-12T03:17:26,573 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.lookup.LookupConfig] from props[druid.lookup.] as [LookupConfig{snapshotWorkingDir=''}]
2016-08-12T03:17:26,577 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.lookup.LookupListeningAnnouncerConfig] from props[druid.lookup.] as [ListeningAnnouncerConfig{listenersPath='/druid/listeners'}]
2016-08-12T03:17:26,578 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.core.LoggingEmitter.start()] on object[com.metamx.emitter.core.LoggingEmitter@28757abd].
2016-08-12T03:17:26,579 INFO [main] com.metamx.emitter.core.LoggingEmitter - Start: started [true]
2016-08-12T03:17:26,579 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner.start()] on object[io.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner@138aa3cc].
2016-08-12T03:17:26,579 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.emitter.service.ServiceEmitter.start()] on object[com.metamx.emitter.service.ServiceEmitter@10a98392].
2016-08-12T03:17:26,579 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.metrics.MonitorScheduler.start()] on object[com.metamx.metrics.MonitorScheduler@30839e44].
2016-08-12T03:17:26,582 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void com.metamx.http.client.NettyHttpClient.start()] on object[com.metamx.http.client.NettyHttpClient@5afbd567].
2016-08-12T03:17:26,583 INFO [main] io.druid.curator.CuratorModule - Starting Curator
2016-08-12T03:17:26,583 INFO [main] org.apache.curator.framework.imps.CuratorFrameworkImpl - Starting
2016-08-12T03:17:26,592 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.8--1, built on 02/06/2016 03:18 GMT
2016-08-12T03:17:26,592 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:host.name=192.168.3.30
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_77
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.home=/Library/Java/JavaVirtualMachines/jdk1.8.0_77.jdk/Contents/Home/jre
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=/Users/giaosudau/Downloads/imply-1.3.0/conf-quickstart/druid/_common:/Users/giaosudau/Downloads/imply-1.3.0/conf-quickstart/druid/middleManager:dist/druid/lib/activation-1.1.1.jar:dist/druid/lib/aether-api-0.9.0.M2.jar:dist/druid/lib/aether-connector-file-0.9.0.M2.jar:dist/druid/lib/aether-connector-okhttp-0.0.9.jar:dist/druid/lib/aether-impl-0.9.0.M2.jar:dist/druid/lib/aether-spi-0.9.0.M2.jar:dist/druid/lib/aether-util-0.9.0.M2.jar:dist/druid/lib/airline-0.7.jar:dist/druid/lib/annotations-2.0.3.jar:dist/druid/lib/antlr4-runtime-4.5.1.jar:dist/druid/lib/aopalliance-1.0.jar:dist/druid/lib/aws-java-sdk-1.10.21.jar:dist/druid/lib/aws-java-sdk-autoscaling-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudformation-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudfront-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudhsm-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudsearch-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudtrail-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudwatch-1.10.21.jar:dist/druid/lib/aws-java-sdk-cloudwatchmetrics-1.10.21.jar:dist/druid/lib/aws-java-sdk-codecommit-1.10.21.jar:dist/druid/lib/aws-java-sdk-codedeploy-1.10.21.jar:dist/druid/lib/aws-java-sdk-codepipeline-1.10.21.jar:dist/druid/lib/aws-java-sdk-cognitoidentity-1.10.21.jar:dist/druid/lib/aws-java-sdk-cognitosync-1.10.21.jar:dist/druid/lib/aws-java-sdk-config-1.10.21.jar:dist/druid/lib/aws-java-sdk-core-1.10.21.jar:dist/druid/lib/aws-java-sdk-datapipeline-1.10.21.jar:dist/druid/lib/aws-java-sdk-devicefarm-1.10.21.jar:dist/druid/lib/aws-java-sdk-directconnect-1.10.21.jar:dist/druid/lib/aws-java-sdk-directory-1.10.21.jar:dist/druid/lib/aws-java-sdk-dynamodb-1.10.21.jar:dist/druid/lib/aws-java-sdk-ec2-1.10.21.jar:dist/druid/lib/aws-java-sdk-ecs-1.10.21.jar:dist/druid/lib/aws-java-sdk-efs-1.10.21.jar:dist/druid/lib/aws-java-sdk-elasticache-1.10.21.jar:dist/druid/lib/aws-java-sdk-elasticbeanstalk-1.10.21.jar:dist
/druid/lib/aws-java-sdk-elasticloadbalancing-1.10.21.jar:dist/druid/lib/aws-java-sdk-elastictranscoder-1.10.21.jar:dist/druid/lib/aws-java-sdk-emr-1.10.21.jar:dist/druid/lib/aws-java-sdk-glacier-1.10.21.jar:dist/druid/lib/aws-java-sdk-iam-1.10.21.jar:dist/druid/lib/aws-java-sdk-importexport-1.10.21.jar:dist/druid/lib/aws-java-sdk-kinesis-1.10.21.jar:dist/druid/lib/aws-java-sdk-kms-1.10.21.jar:dist/druid/lib/aws-java-sdk-lambda-1.10.21.jar:dist/druid/lib/aws-java-sdk-logs-1.10.21.jar:dist/druid/lib/aws-java-sdk-machinelearning-1.10.21.jar:dist/druid/lib/aws-java-sdk-opsworks-1.10.21.jar:dist/druid/lib/aws-java-sdk-rds-1.10.21.jar:dist/druid/lib/aws-java-sdk-redshift-1.10.21.jar:dist/druid/lib/aws-java-sdk-route53-1.10.21.jar:dist/druid/lib/aws-java-sdk-s3-1.10.21.jar:dist/druid/lib/aws-java-sdk-ses-1.10.21.jar:dist/druid/lib/aws-java-sdk-simpledb-1.10.21.jar:dist/druid/lib/aws-java-sdk-simpleworkflow-1.10.21.jar:dist/druid/lib/aws-java-sdk-sns-1.10.21.jar:dist/druid/lib/aws-java-sdk-sqs-1.10.21.jar:dist/druid/lib/aws-java-sdk-ssm-1.10.21.jar:dist/druid/lib/aws-java-sdk-storagegateway-1.10.21.jar:dist/druid/lib/aws-java-sdk-sts-1.10.21.jar:dist/druid/lib/aws-java-sdk-support-1.10.21.jar:dist/druid/lib/aws-java-sdk-swf-libraries-1.10.21.jar:dist/druid/lib/aws-java-sdk-workspaces-1.10.21.jar:dist/druid/lib/base64-2.3.8.jar:dist/druid/lib/bcprov-jdk15on-1.52.jar:dist/druid/lib/bytebuffer-collections-0.2.4.jar:dist/druid/lib/classmate-1.0.0.jar:dist/druid/lib/commons-cli-1.2.jar:dist/druid/lib/commons-codec-1.7.jar:dist/druid/lib/commons-dbcp2-2.0.1.jar:dist/druid/lib/commons-io-2.4.jar:dist/druid/lib/commons-lang-2.6.jar:dist/druid/lib/commons-logging-1.1.1.jar:dist/druid/lib/commons-math3-3.6.1.jar:dist/druid/lib/commons-pool-1.6.jar:dist/druid/lib/commons-pool2-2.2.jar:dist/druid/lib/compress-lzf-1.0.3.jar:dist/druid/lib/config-magic-0.9.jar:dist/druid/lib/curator-client-2.10.0.jar:dist/druid/lib/curator-framework-2.10.0.jar:dist/druid/lib/curator-recipes-2.10.0.jar:di
st/druid/lib/curator-x-discovery-2.10.0.jar:dist/druid/lib/derby-10.11.1.1.jar:dist/druid/lib/derbyclient-10.11.1.1.jar:dist/druid/lib/derbynet-10.11.1.1.jar:dist/druid/lib/derbytools-10.11.1.1.jar:dist/druid/lib/disruptor-3.3.0.jar:dist/druid/lib/druid-api-0.9.1.1.jar:dist/druid/lib/druid-aws-common-0.9.1.1.jar:dist/druid/lib/druid-common-0.9.1.1.jar:dist/druid/lib/druid-console-0.0.2.jar:dist/druid/lib/druid-indexing-hadoop-0.9.1.1.jar:dist/druid/lib/druid-indexing-service-0.9.1.1.jar:dist/druid/lib/druid-processing-0.9.1.1.jar:dist/druid/lib/druid-server-0.9.1.1.jar:dist/druid/lib/druid-services-0.9.1.1.jar:dist/druid/lib/emitter-0.3.6.jar:dist/druid/lib/extendedset-1.3.9.jar:dist/druid/lib/geoip2-0.4.0.jar:dist/druid/lib/google-http-client-jackson2-1.15.0-rc.jar:dist/druid/lib/guava-16.0.1.jar:dist/druid/lib/guice-4.0-beta.jar:dist/druid/lib/guice-multibindings-4.0-beta.jar:dist/druid/lib/guice-servlet-4.0-beta.jar:dist/druid/lib/hibernate-validator-5.1.3.Final.jar:dist/druid/lib/http-client-1.0.4.jar:dist/druid/lib/httpclient-4.5.1.jar:dist/druid/lib/httpcore-4.4.3.jar:dist/druid/lib/icu4j-4.8.1.jar:dist/druid/lib/irc-api-1.0-0014.jar:dist/druid/lib/jackson-annotations-2.4.6.jar:dist/druid/lib/jackson-core-2.4.6.jar:dist/druid/lib/jackson-core-asl-1.9.13.jar:dist/druid/lib/jackson-databind-2.4.6.jar:dist/druid/lib/jackson-dataformat-smile-2.4.6.jar:dist/druid/lib/jackson-datatype-guava-2.4.6.jar:dist/druid/lib/jackson-datatype-joda-2.4.6.jar:dist/druid/lib/jackson-jaxrs-base-2.4.6.jar:dist/druid/lib/jackson-jaxrs-json-provider-2.4.6.jar:dist/druid/lib/jackson-jaxrs-smile-provider-2.4.6.jar:dist/druid/lib/jackson-mapper-asl-1.9.13.jar:dist/druid/lib/jackson-module-jaxb-annotations-2.4.6.jar:dist/druid/lib/java-util-0.27.9.jar:dist/druid/lib/java-xmlbuilder-1.1.jar:dist/druid/lib/javax.el-3.0.0.jar:dist/druid/lib/javax.el-api-3.0.0.jar:dist/druid/lib/javax.inject-1.jar:dist/druid/lib/javax.servlet-api-3.1.0.jar:dist/druid/lib/jboss-logging-3.1.3.GA.jar:dist/druid
/lib/jcl-over-slf4j-1.7.12.jar:dist/druid/lib/jdbi-2.63.1.jar:dist/druid/lib/jersey-core-1.19.jar:dist/druid/lib/jersey-guice-1.19.jar:dist/druid/lib/jersey-server-1.19.jar:dist/druid/lib/jersey-servlet-1.19.jar:dist/druid/lib/jets3t-0.9.4.jar:dist/druid/lib/jetty-client-9.2.5.v20141112.jar:dist/druid/lib/jetty-continuation-9.2.5.v20141112.jar:dist/druid/lib/jetty-http-9.2.5.v20141112.jar:dist/druid/lib/jetty-io-9.2.5.v20141112.jar:dist/druid/lib/jetty-proxy-9.2.5.v20141112.jar:dist/druid/lib/jetty-security-9.2.5.v20141112.jar:dist/druid/lib/jetty-server-9.2.5.v20141112.jar:dist/druid/lib/jetty-servlet-9.2.5.v20141112.jar:dist/druid/lib/jetty-servlets-9.2.5.v20141112.jar:dist/druid/lib/jetty-util-9.2.5.v20141112.jar:dist/druid/lib/jline-0.9.94.jar:dist/druid/lib/joda-time-2.8.2.jar:dist/druid/lib/json-path-2.1.0.jar:dist/druid/lib/jsr305-2.0.1.jar:dist/druid/lib/jsr311-api-1.1.1.jar:dist/druid/lib/log4j-1.2-api-2.5.jar:dist/druid/lib/log4j-api-2.5.jar:dist/druid/lib/log4j-core-2.5.jar:dist/druid/lib/log4j-jul-2.5.jar:dist/druid/lib/log4j-slf4j-impl-2.5.jar:dist/druid/lib/lz4-1.3.0.jar:dist/druid/lib/mapdb-1.0.8.jar:dist/druid/lib/maven-aether-provider-3.1.1.jar:dist/druid/lib/maven-model-3.1.1.jar:dist/druid/lib/maven-model-builder-3.1.1.jar:dist/druid/lib/maven-repository-metadata-3.1.1.jar:dist/druid/lib/maven-settings-3.1.1.jar:dist/druid/lib/maven-settings-builder-3.1.1.jar:dist/druid/lib/maxminddb-0.2.0.jar:dist/druid/lib/netty-3.10.4.Final.jar:dist/druid/lib/okhttp-1.0.2.jar:dist/druid/lib/opencsv-2.3.jar:dist/druid/lib/plexus-interpolation-1.19.jar:dist/druid/lib/plexus-utils-3.0.15.jar:dist/druid/lib/protobuf-java-2.5.0.jar:dist/druid/lib/rhino-1.7R5.jar:dist/druid/lib/RoaringBitmap-0.5.16.jar:dist/druid/lib/server-metrics-0.2.8.jar:dist/druid/lib/slf4j-api-1.7.12.jar:dist/druid/lib/spymemcached-2.11.7.jar:dist/druid/lib/tesla-aether-0.0.5.jar:dist/druid/lib/validation-api-1.1.0.Final.jar:dist/druid/lib/wagon-provider-api-2.4.jar:dist/druid/lib/zookeeper-3.4
.8.jar
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=/Users/giaosudau/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=var/tmp
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA>
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Mac OS X
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=x86_64
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:os.version=10.11.6
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.name=giaosudau
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.home=/Users/giaosudau
2016-08-12T03:17:26,593 INFO [main] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=/Users/giaosudau/Downloads/imply-1.3.0
2016-08-12T03:17:26,594 INFO [main] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=localhost sessionTimeout=30000 watcher=org.apache.curator.ConnectionState@1e60b459
2016-08-12T03:17:26,610 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[io.druid.curator.discovery.ServerDiscoverySelector@767599a7].
2016-08-12T03:17:26,614 INFO [main-SendThread(localhost:2181)] org.apache.zookeeper.ClientCnxn - Opening socket connection to server localhost/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error)
2016-08-12T03:17:26,636 INFO [main-SendThread(localhost:2181)] org.apache.zookeeper.ClientCnxn - Socket connection established to localhost/0:0:0:0:0:0:0:1:2181, initiating session
2016-08-12T03:17:26,643 INFO [main-SendThread(localhost:2181)] org.apache.zookeeper.ClientCnxn - Session establishment complete on server localhost/0:0:0:0:0:0:0:1:2181, sessionid = 0x1567cac9a64000b, negotiated timeout = 30000
2016-08-12T03:17:26,646 INFO [main-EventThread] org.apache.curator.framework.state.ConnectionStateManager - State change: CONNECTED
2016-08-12T03:17:26,780 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.announcement.Announcer.start()] on object[io.druid.curator.announcement.Announcer@66de00f2].
2016-08-12T03:17:26,780 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.curator.discovery.ServerDiscoverySelector.start() throws java.lang.Exception] on object[io.druid.curator.discovery.ServerDiscoverySelector@6c479fdf].
2016-08-12T03:17:26,787 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.start() throws java.lang.InterruptedException] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@54089484].
2016-08-12T03:17:26,841 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Running with task: {
"type" : "index_hadoop",
"id" : "index_hadoop_no_metrics_2016-08-12T03:17:23.355Z",
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : null
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : null,
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
},
"hadoopDependencyCoordinates" : null,
"classpathPrefix" : null,
"context" : null,
"groupId" : "index_hadoop_no_metrics_2016-08-12T03:17:23.355Z",
"dataSource" : "no_metrics",
"resource" : {
"availabilityGroup" : "index_hadoop_no_metrics_2016-08-12T03:17:23.355Z",
"requiredCapacity" : 1
}
}
2016-08-12T03:17:26,848 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Attempting to lock file[var/druid/task/index_hadoop_no_metrics_2016-08-12T03:17:23.355Z/lock].
2016-08-12T03:17:26,850 INFO [main] io.druid.indexing.worker.executor.ExecutorLifecycle - Acquired lock file[var/druid/task/index_hadoop_no_metrics_2016-08-12T03:17:23.355Z/lock] in 2ms.
2016-08-12T03:17:26,854 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_hadoop_no_metrics_2016-08-12T03:17:23.355Z]: LockTryAcquireAction{interval=2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z}
2016-08-12T03:17:26,863 INFO [main] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_hadoop_no_metrics_2016-08-12T03:17:23.355Z] to overlord[http://192.168.3.30:8090/druid/indexer/v1/action]: LockTryAcquireAction{interval=2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z}
2016-08-12T03:17:26,878 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,911 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,911 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,912 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,912 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,913 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,913 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,913 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,914 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,914 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,914 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,915 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,915 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,916 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,916 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,916 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,916 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,917 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,917 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,917 INFO [main] com.metamx.http.client.pool.ChannelResourceFactory - Generating: http://192.168.3.30:8090
2016-08-12T03:17:26,970 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Running task: index_hadoop_no_metrics_2016-08-12T03:17:23.355Z
2016-08-12T03:17:26,971 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_no_metrics_2016-08-12T03:17:23.355Z] location changed to [TaskLocation{host='192.168.3.30', port=8100}].
2016-08-12T03:17:26,971 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_no_metrics_2016-08-12T03:17:23.355Z] status changed to [RUNNING].
2016-08-12T03:17:26,974 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/activation-1.1.jar]
2016-08-12T03:17:26,975 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/avro-1.7.4.jar]
2016-08-12T03:17:26,975 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-beanutils-1.7.0.jar]
2016-08-12T03:17:26,975 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-beanutils-core-1.8.0.jar]
2016-08-12T03:17:26,975 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-cli-1.2.jar]
2016-08-12T03:17:26,975 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-codec-1.4.jar]
2016-08-12T03:17:26,975 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-collections-3.2.1.jar]
2016-08-12T03:17:26,976 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-compress-1.4.1.jar]
2016-08-12T03:17:26,976 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-configuration-1.6.jar]
2016-08-12T03:17:26,976 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-digester-1.8.jar]
2016-08-12T03:17:26,976 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-httpclient-3.1.jar]
2016-08-12T03:17:26,976 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-io-2.4.jar]
2016-08-12T03:17:26,976 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-lang-2.6.jar]
2016-08-12T03:17:26,976 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-logging-1.1.3.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-math3-3.1.1.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/commons-net-3.1.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/guava-11.0.2.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-annotations-2.3.0.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-auth-2.3.0.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-client-2.3.0.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-common-2.3.0.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-hdfs-2.3.0.jar]
2016-08-12T03:17:26,977 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-mapreduce-client-app-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-mapreduce-client-common-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-mapreduce-client-core-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-mapreduce-client-jobclient-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-mapreduce-client-shuffle-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-yarn-api-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-yarn-client-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-yarn-common-2.3.0.jar]
2016-08-12T03:17:26,978 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/hadoop-yarn-server-common-2.3.0.jar]
2016-08-12T03:17:26,979 INFO [main] org.eclipse.jetty.server.Server - jetty-9.2.5.v20141112
2016-08-12T03:17:26,979 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/httpclient-4.2.5.jar]
2016-08-12T03:17:26,979 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/httpcore-4.2.5.jar]
2016-08-12T03:17:26,979 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/jackson-core-asl-1.8.8.jar]
2016-08-12T03:17:26,979 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/jackson-mapper-asl-1.8.8.jar]
2016-08-12T03:17:26,979 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/jaxb-api-2.2.2.jar]
2016-08-12T03:17:26,979 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/jersey-core-1.9.jar]
2016-08-12T03:17:26,979 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/jetty-util-6.1.26.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/jsr305-1.3.9.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/log4j-1.2.17.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/paranamer-2.3.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/protobuf-java-2.5.0.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/servlet-api-2.5.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/slf4j-api-1.7.5.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/slf4j-log4j12-1.7.5.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/snappy-java-1.0.4.1.jar]
2016-08-12T03:17:26,980 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/stax-api-1.0-2.jar]
2016-08-12T03:17:26,981 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/xmlenc-0.52.jar]
2016-08-12T03:17:26,981 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/xz-1.0.jar]
2016-08-12T03:17:26,981 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/zookeeper-3.4.5.jar]
2016-08-12T03:17:26,981 INFO [task-runner-0-priority-0] io.druid.indexing.common.task.HadoopTask - Hadoop Container Druid Classpath is set to [file:/Users/giaosudau/Downloads/imply-1.3.0/conf-quickstart/druid/_common/:file:/Users/giaosudau/Downloads/imply-1.3.0/conf-quickstart/druid/middleManager/:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/activation-1.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aether-api-0.9.0.M2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aether-connector-file-0.9.0.M2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aether-connector-okhttp-0.0.9.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aether-impl-0.9.0.M2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aether-spi-0.9.0.M2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aether-util-0.9.0.M2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/airline-0.7.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/annotations-2.0.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/antlr4-runtime-4.5.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aopalliance-1.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-autoscaling-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cloudformation-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cloudfront-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cloudhsm-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cloudsearch-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cloudtrail-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cloudwatch-1.10.21.jar:file:/Users/giaosudau/Downlo
ads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cloudwatchmetrics-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-codecommit-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-codedeploy-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-codepipeline-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cognitoidentity-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-cognitosync-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-config-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-core-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-datapipeline-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-devicefarm-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-directconnect-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-directory-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-dynamodb-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-ec2-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-ecs-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-efs-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-elasticache-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-elasticbeanstalk-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-elasticloadbalancing-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-elastictranscoder-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-emr-1.10.21.jar:file:/Users/giaosudau/Download
s/imply-1.3.0/dist/druid/lib/aws-java-sdk-glacier-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-iam-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-importexport-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-kinesis-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-kms-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-lambda-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-logs-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-machinelearning-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-opsworks-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-rds-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-redshift-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-route53-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-s3-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-ses-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-simpledb-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-simpleworkflow-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-sns-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-sqs-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-ssm-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-storagegateway-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-sts-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-support-1.10.21.jar:file:/Users/giaos
udau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-swf-libraries-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/aws-java-sdk-workspaces-1.10.21.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/base64-2.3.8.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/bcprov-jdk15on-1.52.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/bytebuffer-collections-0.2.4.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/classmate-1.0.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-cli-1.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-codec-1.7.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-dbcp2-2.0.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-io-2.4.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-lang-2.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-logging-1.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-math3-3.6.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-pool-1.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/commons-pool2-2.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/compress-lzf-1.0.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/config-magic-0.9.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/curator-client-2.10.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/curator-framework-2.10.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/curator-recipes-2.10.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/curator-x-discovery-2.10.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/derby-10.11.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/derbyclient-10.11.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/derbynet-10.11.1.1.jar:file:/U
sers/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/derbytools-10.11.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/disruptor-3.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-api-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-aws-common-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-common-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-console-0.0.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-indexing-hadoop-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-indexing-service-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-processing-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-server-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/druid-services-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/emitter-0.3.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/extendedset-1.3.9.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/geoip2-0.4.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/google-http-client-jackson2-1.15.0-rc.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/guava-16.0.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/guice-4.0-beta.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/guice-multibindings-4.0-beta.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/guice-servlet-4.0-beta.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/hibernate-validator-5.1.3.Final.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/http-client-1.0.4.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/httpclient-4.5.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/httpcore-4.4.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/icu
4j-4.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/irc-api-1.0-0014.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-annotations-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-core-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-core-asl-1.9.13.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-databind-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-dataformat-smile-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-datatype-guava-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-datatype-joda-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-jaxrs-base-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-jaxrs-json-provider-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-jaxrs-smile-provider-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-mapper-asl-1.9.13.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jackson-module-jaxb-annotations-2.4.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/java-util-0.27.9.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/java-xmlbuilder-1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/javax.el-3.0.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/javax.el-api-3.0.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/javax.inject-1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/javax.servlet-api-3.1.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jboss-logging-3.1.3.GA.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jcl-over-slf4j-1.7.12.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jdbi-2.63.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jersey-core-1.19.jar
:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jersey-guice-1.19.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jersey-server-1.19.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jersey-servlet-1.19.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jets3t-0.9.4.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-client-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-continuation-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-http-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-io-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-proxy-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-security-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-server-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-servlet-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-servlets-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jetty-util-9.2.5.v20141112.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jline-0.9.94.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/joda-time-2.8.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/json-path-2.1.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jsr305-2.0.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/jsr311-api-1.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/log4j-1.2-api-2.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/log4j-api-2.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/log4j-core-2.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/log4j-jul-2.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/dru
id/lib/log4j-slf4j-impl-2.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/lz4-1.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/mapdb-1.0.8.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/maven-aether-provider-3.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/maven-model-3.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/maven-model-builder-3.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/maven-repository-metadata-3.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/maven-settings-3.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/maven-settings-builder-3.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/maxminddb-0.2.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/netty-3.10.4.Final.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/okhttp-1.0.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/opencsv-2.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/plexus-interpolation-1.19.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/plexus-utils-3.0.15.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/protobuf-java-2.5.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/rhino-1.7R5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/RoaringBitmap-0.5.16.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/server-metrics-0.2.8.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/slf4j-api-1.7.12.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/spymemcached-2.11.7.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/tesla-aether-0.0.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/validation-api-1.1.0.Final.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/wagon-provider-api-2.4.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/li
b/zookeeper-3.4.8.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/druid-datasketches-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/sketches-core-0.2.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-1.7.7.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-ipc-1.7.7-tests.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-ipc-1.7.7.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-mapred-1.7.7-hadoop2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-collections-3.2.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-compress-1.4.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-lang-2.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/druid-avro-extensions-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/gson-2.3.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/guava-16.0.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jackson-core-asl-1.9.13.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jackson-mapper-asl-1.9.13.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jersey-client-1.15.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jersey-core-1.19.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jetty-6.1.26.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-exte
nsions/jetty-util-6.1.26.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jsr311-api-1.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/netty-3.4.0.Final.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/paranamer-2.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-api-0.1.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-avro-0.1.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-client-0.1.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-common-0.1.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/servlet-api-2.5-20081211.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/slf4j-api-1.6.4.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/snappy-java-1.0.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/velocity-1.7.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/xz-1.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/druid-parquet-extensions-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-avro-1.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-column-1.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-common-1.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-encoding-1.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/dru
id/extensions/druid-parquet-extensions/parquet-format-2.3.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-hadoop-1.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-jackson-1.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-tools-1.8.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/postgresql-metadata-storage/postgresql-9.4.1208.jre7.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/postgresql-metadata-storage/postgresql-metadata-storage-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/avro-1.7.4.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-beanutils-1.7.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-beanutils-core-1.8.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-collections-3.2.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-compress-1.4.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-configuration-1.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-digester-1.8.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-math3-3.6.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-net-3.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/druid-hdfs-storage-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/guava-16.0.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-auth-2.3.0.jar:file:/Use
rs/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-client-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-common-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-hdfs-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-app-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-common-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-core-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-jobclient-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-shuffle-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-api-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-client-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-common-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-server-common-2.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/jaxb-api-2.2.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/paranamer-2.3.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/servlet-api-2.5.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/snappy-java-1.0.4.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/stax-api-1.0-2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/drui
d-hdfs-storage/xmlenc-0.52.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/xz-1.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-histogram/druid-histogram-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/druid-datasketches-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/sketches-core-0.2.2.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/druid-kafka-indexing-service-0.9.1.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/kafka-clients-0.9.0.1.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/lz4-1.3.0.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/slf4j-api-1.7.6.jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/snappy-java-1.1.1.7.jar]
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/lib/log4j-slf4j-impl-2.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/hadoop-dependencies/hadoop-client/2.3.0/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
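The "multiple SLF4J bindings" warning above means two jars on the classpath each ship an `org/slf4j/impl/StaticLoggerBinder` (here `log4j-slf4j-impl-2.5.jar` from Druid's `lib/` and `slf4j-log4j12-1.7.5.jar` from the bundled `hadoop-dependencies/`), and SLF4J picks one of them arbitrarily. A quick way to audit which jars carry a binding is to scan them for that class entry; the sketch below builds two synthetic jars rather than touching real paths, so names like `log4j-slf4j-impl.jar` are illustrative only.

```python
import os
import tempfile
import zipfile

BINDER = "org/slf4j/impl/StaticLoggerBinder.class"

def jars_with_binding(roots):
    """Return every jar under `roots` that contains an SLF4J StaticLoggerBinder."""
    hits = []
    for root in roots:
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if not name.endswith(".jar"):
                    continue
                path = os.path.join(dirpath, name)
                try:
                    with zipfile.ZipFile(path) as jar:
                        if BINDER in jar.namelist():
                            hits.append(path)
                except zipfile.BadZipFile:
                    pass  # skip corrupt or non-zip files
    return hits

# Demo with two synthetic jars (a jar is just a zip archive):
with tempfile.TemporaryDirectory() as d:
    with zipfile.ZipFile(os.path.join(d, "log4j-slf4j-impl.jar"), "w") as z:
        z.writestr(BINDER, b"")                       # carries a binding
    with zipfile.ZipFile(os.path.join(d, "lz4.jar"), "w") as z:
        z.writestr("net/jpountz/lz4/LZ4.class", b"")  # does not
    print(jars_with_binding([d]))  # only the log4j-slf4j-impl jar is reported
```

Pointing `jars_with_binding` at the real `dist/druid/lib` and `dist/druid/hadoop-dependencies` directories would show exactly the two jars SLF4J names in the warning.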
2016-08-12T03:17:27,113 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider as a provider class
2016-08-12T03:17:27,113 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider as a provider class
2016-08-12T03:17:27,113 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering io.druid.server.initialization.jetty.CustomExceptionMapper as a provider class
2016-08-12T03:17:27,113 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Registering io.druid.server.StatusResource as a root resource class
2016-08-12T03:17:27,116 INFO [main] com.sun.jersey.server.impl.application.WebApplicationImpl - Initiating Jersey application, version 'Jersey: 1.19 02/11/2015 03:25 AM'
2016-08-12T03:17:27,199 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.initialization.jetty.CustomExceptionMapper to GuiceManagedComponentProvider with the scope "Singleton"
2016-08-12T03:17:27,202 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.json.JacksonJsonProvider to GuiceManagedComponentProvider with the scope "Singleton"
2016-08-12T03:17:27,209 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding com.fasterxml.jackson.jaxrs.smile.JacksonSmileProvider to GuiceManagedComponentProvider with the scope "Singleton"
2016-08-12 03:17:27,359 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
at java.lang.Class.cast(Class.java:3369)
at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167)
at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301)
at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:94)
at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:111)
at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:48)
at org.apache.logging.log4j.core.LoggerContext.<init>(LoggerContext.java:75)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.createContext(ClassLoaderContextSelector.java:171)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:70)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:57)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:140)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
at java.util.logging.LogManager.demandLogger(LogManager.java:551)
at java.util.logging.Logger.demandLogger(Logger.java:455)
at java.util.logging.Logger.getLogger(Logger.java:502)
at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.inject.Guice.createInjector(Guice.java:73)
at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:59)
at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:99)
at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201)
at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:175)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2016-08-12 03:17:27,363 task-runner-0-priority-0 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
at java.lang.Class.cast(Class.java:3369)
at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167)
at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301)
at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:105)
at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:111)
at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:48)
at org.apache.logging.log4j.core.LoggerContext.<init>(LoggerContext.java:75)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.createContext(ClassLoaderContextSelector.java:171)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:145)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:70)
at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:57)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:140)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
at java.util.logging.LogManager.demandLogger(LogManager.java:551)
at java.util.logging.Logger.demandLogger(Logger.java:455)
at java.util.logging.Logger.getLogger(Logger.java:502)
at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.inject.Guice.createInjector(Guice.java:73)
at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:59)
at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:99)
at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201)
at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:175)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
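The repeated `Cannot cast ...JndiLookup to ...StrLookup` warnings above are the classic symptom of log4j-core being visible to two classloaders at once: the stack shows `HadoopTask.invokeForeignLoader`, which runs the Hadoop job inside a separate classloader, and in the JVM a class's identity is the pair (binary name, defining classloader), so a `JndiLookup` defined by one loader is not assignable to the `StrLookup` loaded by another even though the names match. Python's import machinery can reproduce the same effect: load one source file twice as two separate modules (standing in for the two classloaders; all names below are illustrative) and `isinstance` fails across them.

```python
import importlib.util
import os
import tempfile
import textwrap

# One "library" source file, analogous to the log4j-core jar.
source = textwrap.dedent("""
    class StrLookup:              # stands in for log4j's StrLookup
        pass
    class JndiLookup(StrLookup):  # stands in for log4j's JndiLookup
        pass
""")

def load_as(name, path):
    """Load the file at `path` as a brand-new module named `name`."""
    spec = importlib.util.spec_from_file_location(name, path)
    mod = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mod)
    return mod

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "lookups.py")
    with open(path, "w") as f:
        f.write(source)
    a = load_as("lookups_loader_a", path)  # "classloader" #1
    b = load_as("lookups_loader_b", path)  # "classloader" #2

    obj = a.JndiLookup()
    print(isinstance(obj, a.StrLookup))  # True  -> same loader, the cast works
    print(isinstance(obj, b.StrLookup))  # False -> other loader, "cannot cast"
```

This is why the warnings recur once per log4j reconfiguration inside the task: each configuration pass re-attempts the lookup registration against a `StrLookup` class from the other loader.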
2016-08-12 03:17:27,396 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
at java.lang.Class.cast(Class.java:3369)
at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167)
at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301)
at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:94)
at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:111)
at org.apache.logging.log4j.core.config.xml.XmlConfiguration.<init>(XmlConfiguration.java:81)
at org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory.getConfiguration(XmlConfigurationFactory.java:44)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:490)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:460)
at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:256)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:561)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
at java.util.logging.LogManager.demandLogger(LogManager.java:551)
at java.util.logging.Logger.demandLogger(Logger.java:455)
at java.util.logging.Logger.getLogger(Logger.java:502)
at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.inject.Guice.createInjector(Guice.java:73)
at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:59)
at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:99)
at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201)
at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:175)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2016-08-12 03:17:27,397 task-runner-0-priority-0 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
at java.lang.Class.cast(Class.java:3369)
at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167)
at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301)
at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:105)
at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:111)
at org.apache.logging.log4j.core.config.xml.XmlConfiguration.<init>(XmlConfiguration.java:81)
at org.apache.logging.log4j.core.config.xml.XmlConfigurationFactory.getConfiguration(XmlConfigurationFactory.java:44)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:490)
at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:460)
at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:256)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:561)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
at java.util.logging.LogManager.demandLogger(LogManager.java:551)
at java.util.logging.Logger.demandLogger(Logger.java:455)
at java.util.logging.Logger.getLogger(Logger.java:502)
at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.inject.Guice.createInjector(Guice.java:73)
at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:59)
at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:99)
at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201)
at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:175)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2016-08-12 03:17:27,404 task-runner-0-priority-0 WARN JNDI lookup class is not available because this JRE does not support JNDI. JNDI string lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JndiLookup to org.apache.logging.log4j.core.lookup.StrLookup
at java.lang.Class.cast(Class.java:3369)
at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167)
at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301)
at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:94)
at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:111)
at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:48)
at org.apache.logging.log4j.core.layout.PatternLayout$Builder.build(PatternLayout.java:444)
at org.apache.logging.log4j.core.layout.PatternLayout.createDefaultLayout(PatternLayout.java:327)
at org.apache.logging.log4j.core.appender.ConsoleAppender$Builder.<init>(ConsoleAppender.java:157)
at org.apache.logging.log4j.core.appender.ConsoleAppender.newBuilder(ConsoleAppender.java:149)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.createBuilder(PluginBuilder.java:154)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:119)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:888)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:828)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:820)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:449)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:197)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:209)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:492)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:562)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
at java.util.logging.LogManager.demandLogger(LogManager.java:551)
at java.util.logging.Logger.demandLogger(Logger.java:455)
at java.util.logging.Logger.getLogger(Logger.java:502)
at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.inject.Guice.createInjector(Guice.java:73)
at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:59)
at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:99)
at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201)
at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:175)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
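Note that each of these WARN messages says "continuing configuration": log4j simply proceeds without JNDI/JMX string lookups, so on their own they are noise rather than a task failure. When the same Hadoop-vs-Druid classloader split does cause real breakage, the mitigation commonly documented for Druid's Hadoop-based ingestion is to enable MapReduce classloader isolation in the task's `tuningConfig`; a minimal sketch (the surrounding spec fields are assumptions, not taken from this log):

```json
{
  "tuningConfig": {
    "type": "hadoop",
    "jobProperties": {
      "mapreduce.job.classloader": "true"
    }
  }
}
```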
2016-08-12 03:17:27,406 task-runner-0-priority-0 WARN JMX runtime input lookup class is not available because this JRE does not support JMX. JMX lookups will not be available, continuing configuration. java.lang.ClassCastException: Cannot cast org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup to org.apache.logging.log4j.core.lookup.StrLookup
at java.lang.Class.cast(Class.java:3369)
at org.apache.logging.log4j.util.LoaderUtil.newCheckedInstanceOf(LoaderUtil.java:167)
at org.apache.logging.log4j.core.util.Loader.newCheckedInstanceOf(Loader.java:301)
at org.apache.logging.log4j.core.lookup.Interpolator.<init>(Interpolator.java:105)
at org.apache.logging.log4j.core.config.AbstractConfiguration.<init>(AbstractConfiguration.java:111)
at org.apache.logging.log4j.core.config.DefaultConfiguration.<init>(DefaultConfiguration.java:48)
at org.apache.logging.log4j.core.layout.PatternLayout$Builder.build(PatternLayout.java:444)
at org.apache.logging.log4j.core.layout.PatternLayout.createDefaultLayout(PatternLayout.java:327)
at org.apache.logging.log4j.core.appender.ConsoleAppender$Builder.<init>(ConsoleAppender.java:157)
at org.apache.logging.log4j.core.appender.ConsoleAppender.newBuilder(ConsoleAppender.java:149)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.createBuilder(PluginBuilder.java:154)
at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:119)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:888)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:828)
at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:820)
at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:449)
at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:197)
at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:209)
at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:492)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:562)
at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:578)
at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:214)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:145)
at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:41)
at org.apache.logging.log4j.LogManager.getContext(LogManager.java:182)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:103)
at org.apache.logging.log4j.jul.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:34)
at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:42)
at org.apache.logging.log4j.jul.LogManager.getLogger(LogManager.java:89)
at java.util.logging.LogManager.demandLogger(LogManager.java:551)
at java.util.logging.Logger.demandLogger(Logger.java:455)
at java.util.logging.Logger.getLogger(Logger.java:502)
at com.google.inject.internal.util.Stopwatch.<clinit>(Stopwatch.java:27)
at com.google.inject.internal.InternalInjectorCreator.<init>(InternalInjectorCreator.java:61)
at com.google.inject.Guice.createInjector(Guice.java:96)
at com.google.inject.Guice.createInjector(Guice.java:73)
at io.druid.guice.GuiceInjectors.makeStartupInjector(GuiceInjectors.java:59)
at io.druid.indexer.HadoopDruidIndexerConfig.<clinit>(HadoopDruidIndexerConfig.java:99)
at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:277)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201)
at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:175)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436)
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
2016-08-12T03:17:27,532 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.QueryResource to GuiceInstantiatedComponentProvider
2016-08-12T03:17:27,539 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.segment.realtime.firehose.ChatHandlerResource to GuiceInstantiatedComponentProvider
2016-08-12T03:17:27,544 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.query.lookup.LookupListeningResource to GuiceInstantiatedComponentProvider
2016-08-12T03:17:27,546 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.query.lookup.LookupIntrospectionResource to GuiceInstantiatedComponentProvider
2016-08-12T03:17:27,547 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.http.security.StateResourceFilter to GuiceInstantiatedComponentProvider
2016-08-12T03:17:27,549 INFO [main] com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory - Binding io.druid.server.StatusResource to GuiceManagedComponentProvider with the scope "Undefined"
2016-08-12T03:17:27,556 INFO [task-runner-0-priority-0] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2016-08-12T03:17:27,558 INFO [task-runner-0-priority-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2016-08-12T03:17:27,571 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@15c6027d{/,null,AVAILABLE}
2016-08-12T03:17:27,581 INFO [main] org.eclipse.jetty.server.ServerConnector - Started ServerConnector@794eeaf8{HTTP/1.1}{0.0.0.0:8100}
2016-08-12T03:17:27,582 INFO [main] org.eclipse.jetty.server.Server - Started @4178ms
2016-08-12T03:17:27,582 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.query.lookup.LookupReferencesManager.start()] on object[io.druid.query.lookup.LookupReferencesManager@37c5284a].
2016-08-12T03:17:27,582 INFO [main] io.druid.query.lookup.LookupReferencesManager - Started lookup factory references manager
2016-08-12T03:17:27,582 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.listener.announcer.ListenerResourceAnnouncer.start()] on object[io.druid.query.lookup.LookupResourceListenerAnnouncer@5b8572df].
2016-08-12T03:17:27,592 INFO [task-runner-0-priority-0] org.hibernate.validator.internal.util.Version - HV000001: Hibernate Validator 5.1.3.Final
2016-08-12T03:17:27,595 INFO [main] io.druid.server.listener.announcer.ListenerResourceAnnouncer - Announcing start time on [/druid/listeners/lookups/__default/192.168.3.30:8100]
2016-08-12T03:17:27,595 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking start method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.start()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@421d54b3].
2016-08-12T03:17:27,596 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Announcing self[DruidServerMetadata{name='192.168.3.30:8100', host='192.168.3.30:8100', maxSize=0, tier='_default_tier', type='indexer-executor', priority='0'}] at [/druid/announcements/192.168.3.30:8100]
2016-08-12T03:17:28,119 INFO [task-runner-0-priority-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='dist/druid/extensions', hadoopDependenciesDir='dist/druid/hadoop-dependencies', hadoopContainerDruidClasspath='null', loadList=[druid-datasketches, druid-avro-extensions, druid-parquet-extensions, postgresql-metadata-storage, druid-hdfs-storage, druid-histogram, druid-datasketches, druid-kafka-indexing-service]}]
2016-08-12T03:17:28,129 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.query.aggregation.datasketches.theta.SketchModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,129 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.query.aggregation.datasketches.theta.oldapi.OldApiSketchModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,131 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.data.input.avro.AvroExtensionsModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,132 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.data.input.parquet.ParquetExtensionsModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,133 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.metadata.storage.postgresql.PostgreSQLMetadataStorageModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,135 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.storage.hdfs.HdfsStorageDruidModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,137 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.query.aggregation.histogram.ApproximateHistogramDruidModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,138 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Adding classpath extension module [io.druid.indexing.kafka.KafkaIndexTaskModule] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,139 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,152 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/druid-datasketches-0.9.1.1.jar]
2016-08-12T03:17:28,152 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-datasketches/sketches-core-0.2.2.jar]
2016-08-12T03:17:28,156 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [druid-avro-extensions] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,158 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-1.7.7.jar]
2016-08-12T03:17:28,158 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-ipc-1.7.7-tests.jar]
2016-08-12T03:17:28,158 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-ipc-1.7.7.jar]
2016-08-12T03:17:28,158 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/avro-mapred-1.7.7-hadoop2.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-collections-3.2.1.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-compress-1.4.1.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/commons-lang-2.6.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/druid-avro-extensions-0.9.1.1.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/gson-2.3.1.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/guava-16.0.1.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jackson-core-asl-1.9.13.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jackson-mapper-asl-1.9.13.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jersey-client-1.15.jar]
2016-08-12T03:17:28,159 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jersey-core-1.19.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jetty-6.1.26.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jetty-util-6.1.26.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/jsr311-api-1.1.1.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/netty-3.4.0.Final.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/paranamer-2.3.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-api-0.1.3.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-avro-0.1.3.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-client-0.1.3.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/schema-repo-common-0.1.3.jar]
2016-08-12T03:17:28,160 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/servlet-api-2.5-20081211.jar]
2016-08-12T03:17:28,161 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/slf4j-api-1.6.4.jar]
2016-08-12T03:17:28,161 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/snappy-java-1.0.5.jar]
2016-08-12T03:17:28,161 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/velocity-1.7.jar]
2016-08-12T03:17:28,161 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-avro-extensions/xz-1.0.jar]
2016-08-12T03:17:28,164 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [druid-parquet-extensions] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/druid-parquet-extensions-0.9.1.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-avro-1.8.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-column-1.8.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-common-1.8.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-encoding-1.8.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-format-2.3.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-hadoop-1.8.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-jackson-1.8.1.jar]
2016-08-12T03:17:28,165 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-parquet-extensions/parquet-tools-1.8.1.jar]
2016-08-12T03:17:28,168 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [postgresql-metadata-storage] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,168 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/postgresql-metadata-storage/postgresql-9.4.1208.jre7.jar]
2016-08-12T03:17:28,168 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/postgresql-metadata-storage/postgresql-metadata-storage-0.9.1.1.jar]
2016-08-12T03:17:28,170 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [druid-hdfs-storage] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,170 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/avro-1.7.4.jar]
2016-08-12T03:17:28,170 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-beanutils-1.7.0.jar]
2016-08-12T03:17:28,170 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-beanutils-core-1.8.0.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-collections-3.2.1.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-compress-1.4.1.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-configuration-1.6.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-digester-1.8.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-math3-3.6.1.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/commons-net-3.1.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/druid-hdfs-storage-0.9.1.1.jar]
2016-08-12T03:17:28,171 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/guava-16.0.1.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-auth-2.3.0.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-client-2.3.0.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-common-2.3.0.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-hdfs-2.3.0.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-app-2.3.0.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-common-2.3.0.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-core-2.3.0.jar]
2016-08-12T03:17:28,172 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-jobclient-2.3.0.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-mapreduce-client-shuffle-2.3.0.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-api-2.3.0.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-client-2.3.0.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-common-2.3.0.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/hadoop-yarn-server-common-2.3.0.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/jaxb-api-2.2.2.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/paranamer-2.3.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/servlet-api-2.5.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/snappy-java-1.0.4.1.jar]
2016-08-12T03:17:28,173 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/stax-api-1.0-2.jar]
2016-08-12T03:17:28,174 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/xmlenc-0.52.jar]
2016-08-12T03:17:28,174 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-hdfs-storage/xz-1.0.jar]
2016-08-12T03:17:28,178 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [druid-histogram] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,178 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-histogram/druid-histogram-0.9.1.1.jar]
2016-08-12T03:17:28,180 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [druid-datasketches] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,181 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - Loading extension [druid-kafka-indexing-service] for class [io.druid.initialization.DruidModule]
2016-08-12T03:17:28,181 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/druid-kafka-indexing-service-0.9.1.1.jar]
2016-08-12T03:17:28,181 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/kafka-clients-0.9.0.1.jar]
2016-08-12T03:17:28,181 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/lz4-1.3.0.jar]
2016-08-12T03:17:28,182 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/slf4j-api-1.7.6.jar]
2016-08-12T03:17:28,182 INFO [task-runner-0-priority-0] io.druid.initialization.Initialization - added URL[file:/Users/giaosudau/Downloads/imply-1.3.0/dist/druid/extensions/druid-kafka-indexing-service/snappy-java-1.1.1.7.jar]
2016-08-12T03:17:28,505 WARN [task-runner-0-priority-0] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-08-12T03:17:28,859 INFO [task-runner-0-priority-0] io.druid.guice.JsonConfigurator - Loaded class[class com.metamx.emitter.core.LoggingEmitterConfig] from props[druid.emitter.logging.] as [LoggingEmitterConfig{loggerClass='com.metamx.emitter.core.LoggingEmitter', logLevel='info'}]
2016-08-12T03:17:28,913 INFO [task-runner-0-priority-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.DruidMonitorSchedulerConfig] from props[druid.monitoring.] as [io.druid.server.metrics.DruidMonitorSchedulerConfig@506e871f]
2016-08-12T03:17:28,920 INFO [task-runner-0-priority-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.metrics.MonitorsConfig] from props[druid.monitoring.] as [MonitorsConfig{monitors=[class com.metamx.metrics.JvmMonitor]}]
2016-08-12T03:17:28,928 INFO [task-runner-0-priority-0] io.druid.server.metrics.MetricsModule - Adding monitor[com.metamx.metrics.JvmMonitor@2623a69e]
2016-08-12T03:17:28,928 INFO [task-runner-0-priority-0] io.druid.server.metrics.MetricsModule - Adding monitor[io.druid.query.ExecutorServiceMonitor@2557ac4e]
2016-08-12T03:17:28,928 INFO [task-runner-0-priority-0] io.druid.server.metrics.MetricsModule - Adding monitor[io.druid.server.initialization.jetty.JettyServerModule$JettyMonitor@210aa08d]
2016-08-12T03:17:28,932 INFO [task-runner-0-priority-0] org.skife.config.ConfigurationObjectFactory - Assigning value [256000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2016-08-12T03:17:28,934 INFO [task-runner-0-priority-0] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2016-08-12T03:17:28,934 INFO [task-runner-0-priority-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [druid.computation.buffer.poolCacheMaxCount, ${base_path}.buffer.poolCacheMaxCount] on [io.druid.query.DruidProcessingConfig#poolCacheMaxCount()]
2016-08-12T03:17:28,935 INFO [task-runner-0-priority-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.fifo] on [io.druid.query.DruidProcessingConfig#isFifo()]
2016-08-12T03:17:28,935 INFO [task-runner-0-priority-0] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2016-08-12T03:17:28,935 INFO [task-runner-0-priority-0] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2016-08-12T03:17:29,320 INFO [task-runner-0-priority-0] io.druid.guice.PropertiesModule - Loading properties from common.runtime.properties
2016-08-12T03:17:29,321 INFO [task-runner-0-priority-0] io.druid.guice.PropertiesModule - Loading properties from runtime.properties
2016-08-12T03:17:29,330 INFO [task-runner-0-priority-0] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.guice.ExtensionsConfig] from props[druid.extensions.] as [ExtensionsConfig{searchCurrentClassloader=true, directory='dist/druid/extensions', hadoopDependenciesDir='dist/druid/hadoop-dependencies', hadoopContainerDruidClasspath='null', loadList=[druid-datasketches, druid-avro-extensions, druid-parquet-extensions, postgresql-metadata-storage, druid-hdfs-storage, druid-histogram, druid-datasketches, druid-kafka-indexing-service]}]
2016-08-12T03:17:29,331 INFO [task-runner-0-priority-0] io.druid.indexing.common.task.HadoopIndexTask - Starting a hadoop determine configuration job...
2016-08-12T03:17:29,355 INFO [task-runner-0-priority-0] io.druid.indexer.path.StaticPathSpec - Adding paths[/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet]
2016-08-12T03:17:29,377 INFO [task-runner-0-priority-0] io.druid.indexer.path.StaticPathSpec - Adding paths[/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet]
2016-08-12T03:17:29,432 INFO [task-runner-0-priority-0] org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
2016-08-12T03:17:29,433 INFO [task-runner-0-priority-0] org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
2016-08-12T03:17:29,538 WARN [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobSubmitter - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
2016-08-12T03:17:29,546 WARN [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobSubmitter - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
2016-08-12T03:17:29,568 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2016-08-12T03:17:29,606 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
2016-08-12T03:17:29,682 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_local895013469_0001
2016-08-12T03:17:29,704 WARN [task-runner-0-priority-0] org.apache.hadoop.conf.Configuration - file:/Users/giaosudau/Downloads/imply-1.3.0/var/hadoop-tmp/mapred/staging/giaosudau895013469/.staging/job_local895013469_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
2016-08-12T03:17:29,704 WARN [task-runner-0-priority-0] org.apache.hadoop.conf.Configuration - file:/Users/giaosudau/Downloads/imply-1.3.0/var/hadoop-tmp/mapred/staging/giaosudau895013469/.staging/job_local895013469_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
2016-08-12T03:17:29,787 WARN [task-runner-0-priority-0] org.apache.hadoop.conf.Configuration - file:/Users/giaosudau/Downloads/imply-1.3.0/var/hadoop-tmp/mapred/local/localRunner/giaosudau/job_local895013469_0001/job_local895013469_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
2016-08-12T03:17:29,787 WARN [task-runner-0-priority-0] org.apache.hadoop.conf.Configuration - file:/Users/giaosudau/Downloads/imply-1.3.0/var/hadoop-tmp/mapred/local/localRunner/giaosudau/job_local895013469_0001/job_local895013469_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
2016-08-12T03:17:29,793 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - The url to track the job: http://localhost:8080/
2016-08-12T03:17:29,793 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Job no_metrics-determine_partitions_hashed-Optional.of([2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z]) submitted, status available at: http://localhost:8080/
2016-08-12T03:17:29,794 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Running job: job_local895013469_0001
2016-08-12T03:17:29,796 INFO [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter set in config null
2016-08-12T03:17:29,804 INFO [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2016-08-12T03:17:29,853 INFO [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - Waiting for map tasks
2016-08-12T03:17:29,854 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:29,886 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:29,887 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:29,889 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Processing split: ParquetInputSplit{part: file:/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet start: 0 end: 5909 length: 5909 hosts: []}
2016-08-12T03:17:29,904 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2016-08-12T03:17:29,916 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - (EQUATOR) 0 kvi 26214396(104857584)
2016-08-12T03:17:29,916 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - mapreduce.task.io.sort.mb: 100
2016-08-12T03:17:29,916 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - soft limit at 83886080
2016-08-12T03:17:29,917 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - bufstart = 0; bufvoid = 104857600
2016-08-12T03:17:29,917 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - kvstart = 26214396; length = 6553600
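The sort-buffer figures in the MapTask lines above follow directly from the arithmetic below, assuming Hadoop's stock settings (mapreduce.task.io.sort.mb = 100, mapreduce.map.sort.spill.percent = 0.8); this is a sketch of where the logged numbers come from, not Hadoop's actual code:

```python
# Map-side sort buffer arithmetic under assumed Hadoop defaults
# (mapreduce.task.io.sort.mb = 100, mapreduce.map.sort.spill.percent = 0.8).
sort_mb = 100
bufvoid = sort_mb * 1024 * 1024   # 104857600 -- "bufvoid" in the log
soft_limit = int(bufvoid * 0.8)   # 83886080  -- "soft limit" (spill threshold)
# The kv metadata region is addressed as ints (4 bytes each) and grows down
# from the top of the buffer; one 4-int entry below the top gives the logged
# starting index:
kvstart = bufvoid // 4 - 4        # 26214396, i.e. byte offset 104857584
print(bufvoid, soft_limit, kvstart)
```

Running this reproduces the 104857600 / 83886080 / 26214396(104857584) values in the log lines above.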
2016-08-12T03:17:29,938 INFO [LocalJobRunner Map Task Executor #0] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
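One detail worth flagging in the spec above: Druid interprets timestampSpec.format using Joda-Time pattern letters, where lowercase "mm" means minute-of-hour and uppercase "MM" means month, so the "yyyy-mm-dd" in this spec is almost certainly intended to be "yyyy-MM-dd" for a date-partitioned column like time=2016-07-16. A minimal check of the corrected pattern, using Python's strptime as the analogue (%m is month, %M is minute, mirroring the MM/mm distinction):

```python
from datetime import datetime

# The input path embeds the partition date time=2016-07-16.
# With the corrected (month, not minute) pattern it parses as expected:
ts = datetime.strptime("2016-07-16", "%Y-%m-%d")
print(ts.year, ts.month, ts.day)  # 2016 7 16
```

If the "time" column really is a plain date string, the Joda pattern to put in the spec would be "yyyy-MM-dd".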
2016-08-12T03:17:30,321 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.io.compress.CodecPool - Got brand-new decompressor [.snappy]
2016-08-12T03:17:30,700 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2016-08-12T03:17:30,703 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner -
2016-08-12T03:17:30,704 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Starting flush of map output
2016-08-12T03:17:30,704 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Spilling map output
2016-08-12T03:17:30,704 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - bufstart = 0; bufend = 1064; bufvoid = 104857600
2016-08-12T03:17:30,704 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - kvstart = 26214396(104857584); kvend = 26214176(104856704); length = 221/6553600
2016-08-12T03:17:30,718 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Finished spill 0
2016-08-12T03:17:30,721 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_m_000000_0 is done. And is in the process of committing
2016-08-12T03:17:30,728 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - map
2016-08-12T03:17:30,728 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_m_000000_0' done.
2016-08-12T03:17:30,729 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:30,729 INFO [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2016-08-12T03:17:30,736 INFO [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - Waiting for reduce tasks
2016-08-12T03:17:30,736 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000000_0
2016-08-12T03:17:30,743 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:30,743 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:30,745 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7f72e1e5
2016-08-12T03:17:30,757 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
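The MergerManager limits logged above are derived from one base figure; a sketch of the relationship, assuming the stock ratio mapreduce.reduce.shuffle.memory.limit.percent = 0.25 (mergeThreshold is similarly a ~0.66 fraction of the limit, computed in float precision, so it is not reproduced exactly here):

```python
# Reduce-side shuffle memory limits, assuming the default
# mapreduce.reduce.shuffle.memory.limit.percent of 0.25.
memory_limit = 1336252800                      # memoryLimit from the log
max_single_shuffle = int(memory_limit * 0.25)  # 334063200, matches the log
print(max_single_shuffle)
```

Any single map output larger than maxSingleShuffleLimit is shuffled to disk instead of memory, which is why the tiny 1178-byte output below goes to MEMORY.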
2016-08-12T03:17:30,761 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:30,788 INFO [localfetcher#1] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#1 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 1178 len: 1182 to MEMORY
2016-08-12T03:17:30,791 INFO [localfetcher#1] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 1178 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:30,791 INFO [localfetcher#1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 1178, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->1178
2016-08-12T03:17:30,792 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:30,793 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:30,793 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:30,797 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Job job_local895013469_0001 running in uber mode : false
2016-08-12T03:17:30,799 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - map 100% reduce 0%
2016-08-12T03:17:30,800 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:30,801 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 1 segments left of total size: 1168 bytes
2016-08-12T03:17:30,803 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 1178 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:30,804 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 1182 bytes from disk
2016-08-12T03:17:30,804 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:30,804 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:30,804 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 1 segments left of total size: 1168 bytes
2016-08-12T03:17:30,805 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:30,827 INFO [pool-19-thread-1] org.apache.hadoop.conf.Configuration.deprecation - mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
2016-08-12T03:17:31,262 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000000_0 is done. And is in the process of committing
2016-08-12T03:17:31,263 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,263 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000000_0 is allowed to commit now
2016-08-12T03:17:31,265 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000000_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000000
2016-08-12T03:17:31,266 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,266 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000000_0' done.
2016-08-12T03:17:31,266 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000000_0
2016-08-12T03:17:31,266 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000001_0
2016-08-12T03:17:31,267 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,267 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,267 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@59a9e48f
2016-08-12T03:17:31,268 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,268 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000001_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,270 INFO [localfetcher#2] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#2 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,270 INFO [localfetcher#2] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,270 INFO [localfetcher#2] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,271 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,271 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,271 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,272 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,273 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,273 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,274 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,274 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,274 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,275 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,276 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,285 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
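One detail worth flagging in the spec logged above: the `timestampSpec` declares `"format" : "yyyy-mm-dd"`. In Joda-Time patterns (which Druid uses for timestamp parsing), lowercase `mm` is minute-of-hour; month-of-year is `MM`. A value such as `2016-07-16` would therefore parse with the month defaulting to January, landing outside the configured interval `2016-06-16/2016-08-11` — which is consistent with the near-empty (2-byte) reduce outputs in this log. A minimal sketch using `java.text.SimpleDateFormat`, which shares the same pattern letters (the class and variable names here are illustrative, not from the log):

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.TimeZone;

public class Main {
    public static void main(String[] args) throws Exception {
        TimeZone utc = TimeZone.getTimeZone("UTC");

        // "mm" means minute-of-hour; month-of-year is "MM".
        SimpleDateFormat wrong = new SimpleDateFormat("yyyy-mm-dd");
        SimpleDateFormat right = new SimpleDateFormat("yyyy-MM-dd");
        wrong.setTimeZone(utc);
        right.setTimeZone(utc);

        Date badParse  = wrong.parse("2016-07-16"); // month defaults to January
        Date goodParse = right.parse("2016-07-16"); // July, as intended

        Calendar cal = Calendar.getInstance(utc);
        cal.setTime(badParse);
        System.out.println("wrong pattern month: " + (cal.get(Calendar.MONTH) + 1));
        cal.setTime(goodParse);
        System.out.println("right pattern month: " + (cal.get(Calendar.MONTH) + 1));

        // A January timestamp falls outside the spec's interval
        // 2016-06-16/2016-08-11, so such a row would be discarded at index time.
        Date intervalStart = right.parse("2016-06-16");
        Date intervalEnd   = right.parse("2016-08-11");
        boolean inInterval = !badParse.before(intervalStart) && badParse.before(intervalEnd);
        System.out.println("mis-parsed row inside interval: " + inInterval);
    }
}
```

Changing the spec's format to `yyyy-MM-dd` (or an auto-detected ISO format) would make the parsed timestamps fall inside the declared interval.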
2016-08-12T03:17:31,286 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000001_0 is done. And is in the process of committing
2016-08-12T03:17:31,287 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,287 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000001_0 is allowed to commit now
2016-08-12T03:17:31,288 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000001_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000001
2016-08-12T03:17:31,288 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,289 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000001_0' done.
2016-08-12T03:17:31,289 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000001_0
2016-08-12T03:17:31,289 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000002_0
2016-08-12T03:17:31,290 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,290 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,290 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@591a23ad
2016-08-12T03:17:31,291 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,291 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000002_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,293 INFO [localfetcher#3] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#3 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,293 INFO [localfetcher#3] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,293 INFO [localfetcher#3] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,294 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,294 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,294 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,295 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,296 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,297 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,297 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,297 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,297 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,297 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,297 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,308 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000002_0 is done. And is in the process of committing
2016-08-12T03:17:31,309 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,309 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000002_0 is allowed to commit now
2016-08-12T03:17:31,309 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000002_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000002
2016-08-12T03:17:31,310 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,310 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000002_0' done.
2016-08-12T03:17:31,310 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000002_0
2016-08-12T03:17:31,310 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000003_0
2016-08-12T03:17:31,311 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,311 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,311 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@12d944b8
2016-08-12T03:17:31,312 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,312 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000003_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,313 INFO [localfetcher#4] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#4 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,314 INFO [localfetcher#4] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,314 INFO [localfetcher#4] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,314 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,315 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,315 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,316 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,316 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,316 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,317 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,317 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,317 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,317 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,317 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,330 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000003_0 is done. And is in the process of committing
2016-08-12T03:17:31,330 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,330 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000003_0 is allowed to commit now
2016-08-12T03:17:31,331 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000003_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000003
2016-08-12T03:17:31,332 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,332 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000003_0' done.
2016-08-12T03:17:31,332 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000003_0
2016-08-12T03:17:31,332 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000004_0
2016-08-12T03:17:31,333 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,333 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,333 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@6097e6ad
2016-08-12T03:17:31,333 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,334 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000004_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,335 INFO [localfetcher#5] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#5 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,336 INFO [localfetcher#5] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,336 INFO [localfetcher#5] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,336 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,337 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,337 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,338 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,338 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,339 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,339 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,339 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,339 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,339 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,339 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,348 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000004_0 is done. And is in the process of committing
2016-08-12T03:17:31,348 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,348 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000004_0 is allowed to commit now
2016-08-12T03:17:31,349 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000004_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000004
2016-08-12T03:17:31,349 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,349 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000004_0' done.
2016-08-12T03:17:31,349 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000004_0
2016-08-12T03:17:31,349 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000005_0
2016-08-12T03:17:31,351 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,351 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,351 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@61d8c8b
2016-08-12T03:17:31,351 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,352 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000005_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,354 INFO [localfetcher#6] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#6 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,354 INFO [localfetcher#6] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,354 INFO [localfetcher#6] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,355 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,355 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,355 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,356 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,357 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,357 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,357 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,357 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,357 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,358 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,358 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,367 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000005_0 is done. And is in the process of committing
2016-08-12T03:17:31,368 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,368 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000005_0 is allowed to commit now
2016-08-12T03:17:31,368 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000005_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000005
2016-08-12T03:17:31,369 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,369 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000005_0' done.
2016-08-12T03:17:31,369 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000005_0
2016-08-12T03:17:31,369 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000006_0
2016-08-12T03:17:31,370 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,370 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,370 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@10f4eed3
2016-08-12T03:17:31,370 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,371 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000006_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,372 INFO [localfetcher#7] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#7 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,372 INFO [localfetcher#7] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,372 INFO [localfetcher#7] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,373 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,373 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,373 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,374 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,375 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,376 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,376 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,376 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,376 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,376 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,376 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,383 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,384 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000006_0 is done. And is in the process of committing
2016-08-12T03:17:31,384 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,384 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000006_0 is allowed to commit now
2016-08-12T03:17:31,385 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000006_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000006
2016-08-12T03:17:31,386 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,386 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000006_0' done.
2016-08-12T03:17:31,386 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000006_0
2016-08-12T03:17:31,386 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000007_0
2016-08-12T03:17:31,386 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,386 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,387 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@6bb4eca9
2016-08-12T03:17:31,387 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,387 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000007_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,389 INFO [localfetcher#8] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#8 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,389 INFO [localfetcher#8] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,389 INFO [localfetcher#8] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,390 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,390 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,390 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,391 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,391 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,392 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,392 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,392 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,392 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,392 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,392 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,399 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,400 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000007_0 is done. And is in the process of committing
2016-08-12T03:17:31,400 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,400 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000007_0 is allowed to commit now
2016-08-12T03:17:31,401 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000007_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000007
2016-08-12T03:17:31,401 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,401 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000007_0' done.
2016-08-12T03:17:31,401 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000007_0
2016-08-12T03:17:31,401 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000008_0
2016-08-12T03:17:31,402 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,402 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,402 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@5dd10a74
2016-08-12T03:17:31,402 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,403 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000008_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,404 INFO [localfetcher#9] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#9 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,404 INFO [localfetcher#9] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,404 INFO [localfetcher#9] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,404 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,405 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,405 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,406 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,406 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,406 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,406 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,407 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,407 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,407 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,407 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,414 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,414 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000008_0 is done. And is in the process of committing
2016-08-12T03:17:31,415 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,415 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000008_0 is allowed to commit now
2016-08-12T03:17:31,415 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000008_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000008
2016-08-12T03:17:31,416 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,416 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000008_0' done.
2016-08-12T03:17:31,416 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000008_0
2016-08-12T03:17:31,416 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000009_0
2016-08-12T03:17:31,417 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,417 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,417 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@2ade0088
2016-08-12T03:17:31,417 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,419 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000009_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,420 INFO [localfetcher#10] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#10 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,420 INFO [localfetcher#10] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,420 INFO [localfetcher#10] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,421 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,421 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,421 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,422 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,422 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,423 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,423 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,423 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,423 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,423 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,423 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,430 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,430 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000009_0 is done. And is in the process of committing
2016-08-12T03:17:31,431 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,431 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000009_0 is allowed to commit now
2016-08-12T03:17:31,431 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000009_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000009
2016-08-12T03:17:31,432 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,432 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000009_0' done.
2016-08-12T03:17:31,432 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000009_0
2016-08-12T03:17:31,432 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000010_0
2016-08-12T03:17:31,433 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,433 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,433 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@32bc42f6
2016-08-12T03:17:31,433 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,433 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000010_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,435 INFO [localfetcher#11] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#11 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,435 INFO [localfetcher#11] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,435 INFO [localfetcher#11] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,435 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,436 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,436 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,437 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,437 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,438 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,438 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,438 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,438 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,438 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,438 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,454 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,455 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000010_0 is done. And is in the process of committing
2016-08-12T03:17:31,456 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,456 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000010_0 is allowed to commit now
2016-08-12T03:17:31,456 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000010_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000010
2016-08-12T03:17:31,457 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,457 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000010_0' done.
2016-08-12T03:17:31,457 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000010_0
2016-08-12T03:17:31,457 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000011_0
2016-08-12T03:17:31,458 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,458 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,458 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7f8445c8
2016-08-12T03:17:31,458 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,459 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000011_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,460 INFO [localfetcher#12] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#12 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,460 INFO [localfetcher#12] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,461 INFO [localfetcher#12] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,461 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,461 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,461 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,462 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,462 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,463 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,463 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,463 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,463 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,463 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,464 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,472 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000011_0 is done. And is in the process of committing
2016-08-12T03:17:31,472 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,472 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000011_0 is allowed to commit now
2016-08-12T03:17:31,473 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000011_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000011
2016-08-12T03:17:31,473 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,473 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000011_0' done.
2016-08-12T03:17:31,473 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000011_0
2016-08-12T03:17:31,473 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000012_0
2016-08-12T03:17:31,474 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,475 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,475 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@2786f276
2016-08-12T03:17:31,475 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,475 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000012_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,477 INFO [localfetcher#13] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#13 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,477 INFO [localfetcher#13] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,477 INFO [localfetcher#13] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,478 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,478 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,478 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,479 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,479 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,480 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,480 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,480 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,480 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,480 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,480 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,487 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000012_0 is done. And is in the process of committing
2016-08-12T03:17:31,488 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,488 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000012_0 is allowed to commit now
2016-08-12T03:17:31,488 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000012_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000012
2016-08-12T03:17:31,489 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,489 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000012_0' done.
2016-08-12T03:17:31,489 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000012_0
2016-08-12T03:17:31,489 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000013_0
2016-08-12T03:17:31,489 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,489 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,490 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@57b6c22d
2016-08-12T03:17:31,490 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,490 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000013_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,492 INFO [localfetcher#14] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#14 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,492 INFO [localfetcher#14] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,492 INFO [localfetcher#14] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,492 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,492 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,493 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,493 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,493 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,494 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,494 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,494 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,494 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,494 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,495 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,501 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000013_0 is done. And is in the process of committing
2016-08-12T03:17:31,502 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,502 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000013_0 is allowed to commit now
2016-08-12T03:17:31,502 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000013_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000013
2016-08-12T03:17:31,503 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,503 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000013_0' done.
2016-08-12T03:17:31,503 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000013_0
2016-08-12T03:17:31,503 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000014_0
2016-08-12T03:17:31,503 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,503 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,503 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@27481fb3
2016-08-12T03:17:31,504 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,504 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000014_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,505 INFO [localfetcher#15] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#15 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,506 INFO [localfetcher#15] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,506 INFO [localfetcher#15] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,506 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,506 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,506 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,507 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,507 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,508 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,508 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,508 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,508 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,508 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,508 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,515 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000014_0 is done. And is in the process of committing
2016-08-12T03:17:31,516 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,516 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000014_0 is allowed to commit now
2016-08-12T03:17:31,516 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000014_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000014
2016-08-12T03:17:31,517 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,517 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000014_0' done.
2016-08-12T03:17:31,517 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000014_0
2016-08-12T03:17:31,517 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000015_0
2016-08-12T03:17:31,517 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,517 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,518 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7c6526a7
2016-08-12T03:17:31,518 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,518 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000015_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,520 INFO [localfetcher#16] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#16 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,520 INFO [localfetcher#16] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,520 INFO [localfetcher#16] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,520 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,520 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,520 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,521 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,522 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,522 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,522 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,522 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,522 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,522 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,523 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,529 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
2016-08-12T03:17:31,530 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000015_0 is done. And is in the process of committing
2016-08-12T03:17:31,530 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,530 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000015_0 is allowed to commit now
2016-08-12T03:17:31,531 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000015_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000015
2016-08-12T03:17:31,531 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,531 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000015_0' done.
2016-08-12T03:17:31,531 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000015_0
2016-08-12T03:17:31,531 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000016_0
2016-08-12T03:17:31,532 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,532 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,532 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@534c4f10
2016-08-12T03:17:31,532 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,532 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000016_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,534 INFO [localfetcher#17] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#17 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,534 INFO [localfetcher#17] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,534 INFO [localfetcher#17] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,534 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,534 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,534 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,535 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,536 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,536 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,536 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,536 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,536 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,536 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,537 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,543 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
2016-08-12T03:17:31,544 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000016_0 is done. And is in the process of committing
2016-08-12T03:17:31,545 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,545 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000016_0 is allowed to commit now
2016-08-12T03:17:31,546 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000016_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000016
2016-08-12T03:17:31,546 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,546 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000016_0' done.
2016-08-12T03:17:31,546 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000016_0
2016-08-12T03:17:31,546 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000017_0
2016-08-12T03:17:31,548 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,548 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,548 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@258507f5
2016-08-12T03:17:31,548 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,548 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000017_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,550 INFO [localfetcher#18] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#18 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,551 INFO [localfetcher#18] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,551 INFO [localfetcher#18] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,551 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,551 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,552 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,553 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,553 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,554 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,554 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,554 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,554 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,554 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,554 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,563 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
2016-08-12T03:17:31,564 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000017_0 is done. And is in the process of committing
2016-08-12T03:17:31,564 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,564 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000017_0 is allowed to commit now
2016-08-12T03:17:31,565 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000017_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000017
2016-08-12T03:17:31,565 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,565 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000017_0' done.
2016-08-12T03:17:31,565 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000017_0
2016-08-12T03:17:31,565 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000018_0
2016-08-12T03:17:31,566 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,566 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,566 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7728b59b
2016-08-12T03:17:31,567 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,567 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000018_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,569 INFO [localfetcher#19] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#19 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,569 INFO [localfetcher#19] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,569 INFO [localfetcher#19] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,569 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,570 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,570 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,571 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,571 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,571 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,571 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,572 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,572 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,572 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,572 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,580 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
2016-08-12T03:17:31,580 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000018_0 is done. And is in the process of committing
2016-08-12T03:17:31,581 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,581 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000018_0 is allowed to commit now
2016-08-12T03:17:31,582 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000018_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000018
2016-08-12T03:17:31,582 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,582 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000018_0' done.
2016-08-12T03:17:31,582 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000018_0
2016-08-12T03:17:31,582 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000019_0
2016-08-12T03:17:31,583 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,583 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,583 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@34c6b5c1
2016-08-12T03:17:31,584 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,585 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000019_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,587 INFO [localfetcher#20] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#20 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,587 INFO [localfetcher#20] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,587 INFO [localfetcher#20] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,588 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,589 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,589 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,591 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,591 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,592 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,592 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,592 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,592 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,593 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,593 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,600 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
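
Note: in the `timestampSpec` of the config above, `"format" : "yyyy-mm-dd"` is a Joda-Time pattern in which lowercase `mm` means minute-of-hour, not month; for a calendar-date column the month pattern is uppercase `MM`. Assuming the `time` column really holds dates, a corrected fragment would look like:

```json
{
  "timestampSpec" : {
    "column" : "time",
    "format" : "yyyy-MM-dd"
  }
}
```

With the lowercase pattern, every value parses with month = January, which can silently push rows outside the job's `intervals`.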
2016-08-12T03:17:31,601 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000019_0 is done. And is in the process of committing
2016-08-12T03:17:31,601 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,601 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000019_0 is allowed to commit now
2016-08-12T03:17:31,602 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000019_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000019
2016-08-12T03:17:31,602 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,602 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000019_0' done.
2016-08-12T03:17:31,602 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000019_0
2016-08-12T03:17:31,602 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000020_0
2016-08-12T03:17:31,603 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,603 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,603 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@2a450e7e
2016-08-12T03:17:31,603 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,604 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000020_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,605 INFO [localfetcher#21] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#21 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,605 INFO [localfetcher#21] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,605 INFO [localfetcher#21] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,606 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,606 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,606 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,607 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,607 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,608 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,608 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,608 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,608 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,609 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,609 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,615 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000020_0 is done. And is in the process of committing
2016-08-12T03:17:31,616 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,616 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000020_0 is allowed to commit now
2016-08-12T03:17:31,616 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000020_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000020
2016-08-12T03:17:31,617 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,617 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000020_0' done.
2016-08-12T03:17:31,617 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000020_0
2016-08-12T03:17:31,617 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000021_0
2016-08-12T03:17:31,618 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,618 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,618 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@1fd48e27
2016-08-12T03:17:31,618 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,619 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000021_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,620 INFO [localfetcher#22] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#22 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,620 INFO [localfetcher#22] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,620 INFO [localfetcher#22] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,620 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,621 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,621 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,622 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,622 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,622 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,623 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,623 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,623 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,623 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,623 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,630 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000021_0 is done. And is in the process of committing
2016-08-12T03:17:31,631 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,631 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000021_0 is allowed to commit now
2016-08-12T03:17:31,632 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000021_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000021
2016-08-12T03:17:31,632 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,632 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000021_0' done.
2016-08-12T03:17:31,632 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000021_0
2016-08-12T03:17:31,632 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000022_0
2016-08-12T03:17:31,633 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,633 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,633 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@1ab0d58f
2016-08-12T03:17:31,633 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,633 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000022_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,635 INFO [localfetcher#23] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#23 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,635 INFO [localfetcher#23] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,635 INFO [localfetcher#23] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,635 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,636 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,636 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,637 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,637 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,637 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,638 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,638 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,638 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,638 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,638 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,646 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000022_0 is done. And is in the process of committing
2016-08-12T03:17:31,646 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,646 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000022_0 is allowed to commit now
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000022_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000022
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000022_0' done.
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000022_0
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000023_0
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,647 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@55a3192b
2016-08-12T03:17:31,648 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,648 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000023_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,649 INFO [localfetcher#24] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#24 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,650 INFO [localfetcher#24] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,650 INFO [localfetcher#24] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,650 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,651 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,651 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,652 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,652 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,653 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,654 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,654 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,654 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,654 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,654 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,662 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,663 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000023_0 is done. And is in the process of committing
2016-08-12T03:17:31,664 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,664 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000023_0 is allowed to commit now
2016-08-12T03:17:31,664 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000023_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000023
2016-08-12T03:17:31,664 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,664 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000023_0' done.
2016-08-12T03:17:31,664 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000023_0
2016-08-12T03:17:31,664 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000024_0
2016-08-12T03:17:31,665 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,665 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,665 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@716f3df9
2016-08-12T03:17:31,665 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,666 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000024_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,668 INFO [localfetcher#25] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#25 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,668 INFO [localfetcher#25] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,668 INFO [localfetcher#25] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,668 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,669 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,669 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,670 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,670 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,670 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,671 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,671 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,671 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,671 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,671 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,678 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000024_0 is done. And is in the process of committing
2016-08-12T03:17:31,678 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,678 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000024_0 is allowed to commit now
2016-08-12T03:17:31,679 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000024_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000024
2016-08-12T03:17:31,679 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,679 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000024_0' done.
2016-08-12T03:17:31,679 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000024_0
2016-08-12T03:17:31,679 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000025_0
2016-08-12T03:17:31,680 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,680 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,680 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@3ac8efd1
2016-08-12T03:17:31,680 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,681 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000025_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,682 INFO [localfetcher#26] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#26 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,683 INFO [localfetcher#26] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,683 INFO [localfetcher#26] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,683 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,683 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,683 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,684 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,685 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,685 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,686 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,686 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,686 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,686 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,686 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,695 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000025_0 is done. And is in the process of committing
2016-08-12T03:17:31,696 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,696 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000025_0 is allowed to commit now
2016-08-12T03:17:31,696 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000025_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000025
2016-08-12T03:17:31,696 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,696 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000025_0' done.
2016-08-12T03:17:31,696 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000025_0
2016-08-12T03:17:31,697 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000026_0
2016-08-12T03:17:31,697 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,697 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,697 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@2335cd34
2016-08-12T03:17:31,697 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,698 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000026_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,699 INFO [localfetcher#27] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#27 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,700 INFO [localfetcher#27] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,700 INFO [localfetcher#27] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,700 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,701 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,701 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,702 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,702 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,703 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,703 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,703 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,703 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,704 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,704 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,713 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000026_0 is done. And is in the process of committing
2016-08-12T03:17:31,713 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,713 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000026_0 is allowed to commit now
2016-08-12T03:17:31,714 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000026_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000026
2016-08-12T03:17:31,714 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,714 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000026_0' done.
2016-08-12T03:17:31,714 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000026_0
2016-08-12T03:17:31,714 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000027_0
2016-08-12T03:17:31,715 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,715 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,715 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@574a5a08
2016-08-12T03:17:31,715 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,715 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000027_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,716 INFO [localfetcher#28] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#28 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,717 INFO [localfetcher#28] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,717 INFO [localfetcher#28] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,717 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,718 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,718 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,719 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,719 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,720 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,720 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,720 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,720 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,721 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,721 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,732 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000027_0 is done. And is in the process of committing
2016-08-12T03:17:31,732 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,732 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000027_0 is allowed to commit now
2016-08-12T03:17:31,733 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000027_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000027
2016-08-12T03:17:31,733 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,733 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000027_0' done.
2016-08-12T03:17:31,733 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000027_0
2016-08-12T03:17:31,733 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000028_0
2016-08-12T03:17:31,734 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,734 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,734 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@1c0ae189
2016-08-12T03:17:31,734 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,735 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000028_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,736 INFO [localfetcher#29] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#29 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,736 INFO [localfetcher#29] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,736 INFO [localfetcher#29] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,737 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,737 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,737 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,738 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,739 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,739 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,739 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,739 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,739 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,740 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,740 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,746 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
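An editorial note on the spec above (an observation on the logged config, not part of the log): Druid `timestampSpec` format strings are Joda-Time patterns, in which lowercase `mm` means minute-of-hour, not month. For a hive-style partition value like `time=2016-07-16`, the intended pattern is almost certainly `yyyy-MM-dd`:

```json
{
  "timestampSpec" : {
    "column" : "time",
    "format" : "yyyy-MM-dd"
  }
}
```

With the format as logged (`yyyy-mm-dd`), the month field is never parsed, so rows may be timestamped into the wrong DAY bucket or dropped as outside the configured `intervals` (`2016-06-16/2016-08-11`) even though the reduce tasks themselves complete successfully.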
2016-08-12T03:17:31,746 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000028_0 is done. And is in the process of committing
2016-08-12T03:17:31,747 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,747 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000028_0 is allowed to commit now
2016-08-12T03:17:31,747 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000028_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000028
2016-08-12T03:17:31,747 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,747 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000028_0' done.
2016-08-12T03:17:31,747 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000028_0
2016-08-12T03:17:31,747 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000029_0
2016-08-12T03:17:31,748 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,748 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,748 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@3fc5ed41
2016-08-12T03:17:31,748 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,749 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000029_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,750 INFO [localfetcher#30] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#30 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,750 INFO [localfetcher#30] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,750 INFO [localfetcher#30] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,751 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,751 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,751 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,752 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,752 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,753 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,753 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,753 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,753 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,753 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,753 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,762 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000029_0 is done. And is in the process of committing
2016-08-12T03:17:31,762 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,762 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000029_0 is allowed to commit now
2016-08-12T03:17:31,763 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000029_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000029
2016-08-12T03:17:31,763 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,763 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000029_0' done.
2016-08-12T03:17:31,763 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000029_0
2016-08-12T03:17:31,763 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000030_0
2016-08-12T03:17:31,763 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,763 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,764 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@3b8b9d70
2016-08-12T03:17:31,764 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,764 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000030_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,766 INFO [localfetcher#31] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#31 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,766 INFO [localfetcher#31] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,766 INFO [localfetcher#31] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,767 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,767 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,767 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,768 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,768 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,769 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,769 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,769 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,769 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,769 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,769 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,777 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000030_0 is done. And is in the process of committing
2016-08-12T03:17:31,778 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,778 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000030_0 is allowed to commit now
2016-08-12T03:17:31,778 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000030_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000030
2016-08-12T03:17:31,778 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,778 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000030_0' done.
2016-08-12T03:17:31,779 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000030_0
2016-08-12T03:17:31,779 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000031_0
2016-08-12T03:17:31,779 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,779 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,779 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@4257eaca
2016-08-12T03:17:31,780 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,780 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000031_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,781 INFO [localfetcher#32] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#32 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,781 INFO [localfetcher#32] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,781 INFO [localfetcher#32] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,782 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,782 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,782 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,783 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,783 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,784 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,785 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,785 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,785 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,785 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,785 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,796 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000031_0 is done. And is in the process of committing
2016-08-12T03:17:31,796 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,796 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000031_0 is allowed to commit now
2016-08-12T03:17:31,797 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000031_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000031
2016-08-12T03:17:31,797 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,797 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000031_0' done.
2016-08-12T03:17:31,797 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000031_0
2016-08-12T03:17:31,797 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000032_0
2016-08-12T03:17:31,798 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,798 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,798 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@64b547af
2016-08-12T03:17:31,798 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,798 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000032_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,799 INFO [localfetcher#33] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#33 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,800 INFO [localfetcher#33] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,800 INFO [localfetcher#33] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,800 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,800 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,800 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,801 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,801 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,802 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,802 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,802 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,802 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,802 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,802 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,804 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - map 100% reduce 57%
2016-08-12T03:17:31,813 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
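One thing worth flagging in the spec above: Druid parses `timestampSpec` formats with Joda-Time, where lowercase `mm` means minute-of-hour and uppercase `MM` means month-of-year. So `"format" : "yyyy-mm-dd"` reads the middle field of a value like `2016-07-16` as minutes, leaving the month at its default (January) and likely pushing every row outside the configured `intervals` — which would be consistent with the tiny 2-byte map outputs in this log. Assuming the `time` column really holds `yyyy-MM-dd` dates, a corrected `timestampSpec` would be:

```json
{
  "timestampSpec" : {
    "column" : "time",
    "format" : "yyyy-MM-dd"
  }
}
```

This is a sketch based on the paths in the spec (e.g. `time=2016-07-16`); verify the actual contents of the `time` column before changing the format.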
2016-08-12T03:17:31,813 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000032_0 is done. And is in the process of committing
2016-08-12T03:17:31,814 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,814 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000032_0 is allowed to commit now
2016-08-12T03:17:31,814 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000032_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000032
2016-08-12T03:17:31,814 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,814 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000032_0' done.
2016-08-12T03:17:31,814 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000032_0
2016-08-12T03:17:31,814 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000033_0
2016-08-12T03:17:31,815 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,815 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,815 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@e67f53b
2016-08-12T03:17:31,815 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,816 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000033_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,817 INFO [localfetcher#34] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#34 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,817 INFO [localfetcher#34] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,817 INFO [localfetcher#34] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,817 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,818 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,818 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,819 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,819 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,819 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,819 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,819 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,819 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,820 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,820 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,827 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000033_0 is done. And is in the process of committing
2016-08-12T03:17:31,828 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,828 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000033_0 is allowed to commit now
2016-08-12T03:17:31,828 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000033_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000033
2016-08-12T03:17:31,828 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,828 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000033_0' done.
2016-08-12T03:17:31,828 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000033_0
2016-08-12T03:17:31,828 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000034_0
2016-08-12T03:17:31,829 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,829 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,829 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@52d4ad5b
2016-08-12T03:17:31,829 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,830 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000034_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,831 INFO [localfetcher#35] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#35 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,831 INFO [localfetcher#35] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,831 INFO [localfetcher#35] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,832 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,832 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,832 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,833 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,833 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,834 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,834 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,834 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,834 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,834 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,834 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,841 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000034_0 is done. And is in the process of committing
2016-08-12T03:17:31,841 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,841 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000034_0 is allowed to commit now
2016-08-12T03:17:31,841 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000034_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000034
2016-08-12T03:17:31,842 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,842 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000034_0' done.
2016-08-12T03:17:31,842 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000034_0
2016-08-12T03:17:31,842 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000035_0
2016-08-12T03:17:31,842 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,842 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,842 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@4a984153
2016-08-12T03:17:31,843 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,843 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000035_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,844 INFO [localfetcher#36] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#36 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,845 INFO [localfetcher#36] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,845 INFO [localfetcher#36] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,845 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,845 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,845 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,846 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,847 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,847 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,847 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,847 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,847 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,847 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,848 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,854 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000035_0 is done. And is in the process of committing
2016-08-12T03:17:31,855 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,855 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000035_0 is allowed to commit now
2016-08-12T03:17:31,855 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000035_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000035
2016-08-12T03:17:31,855 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,855 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000035_0' done.
2016-08-12T03:17:31,855 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000035_0
2016-08-12T03:17:31,855 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000036_0
2016-08-12T03:17:31,856 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,856 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,856 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@310592ff
2016-08-12T03:17:31,856 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,856 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000036_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,857 INFO [localfetcher#37] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#37 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,857 INFO [localfetcher#37] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,858 INFO [localfetcher#37] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,858 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,858 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,858 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,859 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,859 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,860 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,860 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,860 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,860 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,860 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,860 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,866 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
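One detail worth flagging in the config above: Druid's `timestampSpec.format` is a Joda-Time pattern, where lowercase `mm` means minute-of-hour and uppercase `MM` means month. The logged value `yyyy-mm-dd` would therefore parse the month position of the `time` column as minutes, which can leave rows outside the job's `intervals` (plausibly consistent with the near-empty reduce inputs, `decomp: 2 len: 6`, seen below). A likely correction, assuming the `time` column holds date strings like `2016-07-16`, is:

```json
"timestampSpec" : {
  "column" : "time",
  "format" : "yyyy-MM-dd"
}
```

This is a sketch of the fix, not what the job actually ran with; the log records `yyyy-mm-dd` as the pattern in effect.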
2016-08-12T03:17:31,866 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000036_0 is done. And is in the process of committing
2016-08-12T03:17:31,867 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,867 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000036_0 is allowed to commit now
2016-08-12T03:17:31,867 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000036_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000036
2016-08-12T03:17:31,868 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,868 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000036_0' done.
2016-08-12T03:17:31,868 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000036_0
2016-08-12T03:17:31,868 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000037_0
2016-08-12T03:17:31,868 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,868 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,868 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7f573a70
2016-08-12T03:17:31,869 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,869 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000037_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,870 INFO [localfetcher#38] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#38 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,870 INFO [localfetcher#38] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,870 INFO [localfetcher#38] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,871 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,871 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,871 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,872 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,872 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,872 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,872 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,872 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,872 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,873 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,873 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,879 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000037_0 is done. And is in the process of committing
2016-08-12T03:17:31,879 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,879 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000037_0 is allowed to commit now
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000037_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000037
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000037_0' done.
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000037_0
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000038_0
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,880 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@46b89370
2016-08-12T03:17:31,881 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,881 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000038_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,882 INFO [localfetcher#39] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#39 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,882 INFO [localfetcher#39] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,882 INFO [localfetcher#39] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,883 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,883 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,883 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,884 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,885 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,885 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,885 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,885 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,885 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,885 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,885 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,892 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000038_0 is done. And is in the process of committing
2016-08-12T03:17:31,892 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,892 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000038_0 is allowed to commit now
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000038_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000038
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000038_0' done.
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000038_0
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000039_0
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,893 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@1cd3fcdb
2016-08-12T03:17:31,894 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,894 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000039_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,895 INFO [localfetcher#40] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#40 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,895 INFO [localfetcher#40] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,895 INFO [localfetcher#40] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,896 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,896 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,896 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,897 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,897 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,898 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,898 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,898 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,898 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,898 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,898 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,905 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000039_0 is done. And is in the process of committing
2016-08-12T03:17:31,905 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,906 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000039_0 is allowed to commit now
2016-08-12T03:17:31,906 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000039_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000039
2016-08-12T03:17:31,906 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,906 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000039_0' done.
2016-08-12T03:17:31,906 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000039_0
2016-08-12T03:17:31,906 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000040_0
2016-08-12T03:17:31,907 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,907 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,907 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7e211b49
2016-08-12T03:17:31,907 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,907 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000040_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,908 INFO [localfetcher#41] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#41 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,908 INFO [localfetcher#41] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,908 INFO [localfetcher#41] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,909 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,909 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,909 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,909 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,910 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,910 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,910 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,910 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,910 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,911 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,911 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,917 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000040_0 is done. And is in the process of committing
2016-08-12T03:17:31,917 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,918 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000040_0 is allowed to commit now
2016-08-12T03:17:31,918 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000040_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000040
2016-08-12T03:17:31,918 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,918 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000040_0' done.
2016-08-12T03:17:31,918 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000040_0
2016-08-12T03:17:31,918 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000041_0
2016-08-12T03:17:31,919 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,919 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,919 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@29989a35
2016-08-12T03:17:31,919 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,919 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000041_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,920 INFO [localfetcher#42] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#42 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,920 INFO [localfetcher#42] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,920 INFO [localfetcher#42] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,920 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,921 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,921 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,921 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,921 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,922 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,922 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,922 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,922 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,922 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,922 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,929 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,930 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000041_0 is done. And is in the process of committing
2016-08-12T03:17:31,930 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,930 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000041_0 is allowed to commit now
2016-08-12T03:17:31,930 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000041_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000041
2016-08-12T03:17:31,931 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,931 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000041_0' done.
2016-08-12T03:17:31,931 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000041_0
2016-08-12T03:17:31,931 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000042_0
2016-08-12T03:17:31,931 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,931 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,931 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@22cdec4a
2016-08-12T03:17:31,932 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,932 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000042_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,933 INFO [localfetcher#43] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#43 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,933 INFO [localfetcher#43] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,933 INFO [localfetcher#43] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,933 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,934 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,934 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,934 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,934 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,935 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,935 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,935 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,935 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,935 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,935 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,943 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
[config omitted: verbatim duplicate of the config block logged at 03:17:31,929 above]
2016-08-12T03:17:31,944 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000042_0 is done. And is in the process of committing
2016-08-12T03:17:31,945 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,945 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000042_0 is allowed to commit now
2016-08-12T03:17:31,945 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000042_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000042
2016-08-12T03:17:31,945 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,946 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000042_0' done.
2016-08-12T03:17:31,946 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000042_0
2016-08-12T03:17:31,946 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000043_0
2016-08-12T03:17:31,946 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,946 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,946 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@116db064
2016-08-12T03:17:31,947 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,947 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000043_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,948 INFO [localfetcher#44] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#44 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,948 INFO [localfetcher#44] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,948 INFO [localfetcher#44] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,948 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,949 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,949 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,949 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,949 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,950 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,950 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,950 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,950 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,950 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,951 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,957 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
[config omitted: verbatim duplicate of the config block logged at 03:17:31,929 above]
2016-08-12T03:17:31,958 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000043_0 is done. And is in the process of committing
2016-08-12T03:17:31,958 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,958 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000043_0 is allowed to commit now
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000043_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000043
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000043_0' done.
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000043_0
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000044_0
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,959 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@aef1188
2016-08-12T03:17:31,960 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,960 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000044_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,961 INFO [localfetcher#45] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#45 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,961 INFO [localfetcher#45] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,961 INFO [localfetcher#45] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,961 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,961 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,961 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,967 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,967 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,968 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,968 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,968 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,968 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,969 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,969 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,976 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
[config omitted: verbatim duplicate of the config block logged at 03:17:31,929 above]
2016-08-12T03:17:31,976 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000044_0 is done. And is in the process of committing
2016-08-12T03:17:31,977 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,977 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000044_0 is allowed to commit now
2016-08-12T03:17:31,977 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000044_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000044
2016-08-12T03:17:31,977 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,977 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000044_0' done.
2016-08-12T03:17:31,977 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000044_0
2016-08-12T03:17:31,978 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000045_0
2016-08-12T03:17:31,978 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,978 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,978 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@44219349
2016-08-12T03:17:31,978 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,979 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000045_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,980 INFO [localfetcher#46] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#46 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,980 INFO [localfetcher#46] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,980 INFO [localfetcher#46] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,980 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,981 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,981 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,981 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,981 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,982 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,982 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,982 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,982 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,983 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,983 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,989 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:31,990 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000045_0 is done. And is in the process of committing
2016-08-12T03:17:31,990 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,990 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000045_0 is allowed to commit now
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000045_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000045
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000045_0' done.
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000045_0
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000046_0
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:31,991 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@19460f93
2016-08-12T03:17:31,992 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:31,992 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000046_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:31,993 INFO [localfetcher#47] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#47 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:31,993 INFO [localfetcher#47] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:31,993 INFO [localfetcher#47] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:31,993 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:31,993 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:31,993 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:31,994 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,994 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,994 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:31,995 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:31,995 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:31,995 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:31,995 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:31,995 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,001 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000046_0 is done. And is in the process of committing
2016-08-12T03:17:32,001 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,001 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000046_0 is allowed to commit now
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000046_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000046
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000046_0' done.
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000046_0
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000047_0
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,002 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7674afe0
2016-08-12T03:17:32,003 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,003 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000047_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,004 INFO [localfetcher#48] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#48 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,004 INFO [localfetcher#48] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,004 INFO [localfetcher#48] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,004 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,004 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,004 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,005 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,005 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,005 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,005 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,005 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,005 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,006 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,006 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,012 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000047_0 is done. And is in the process of committing
2016-08-12T03:17:32,012 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,012 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000047_0 is allowed to commit now
2016-08-12T03:17:32,012 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000047_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000047
2016-08-12T03:17:32,012 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,013 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000047_0' done.
2016-08-12T03:17:32,013 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000047_0
2016-08-12T03:17:32,013 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000048_0
2016-08-12T03:17:32,013 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,013 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,013 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@de37e24
2016-08-12T03:17:32,013 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,014 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000048_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,014 INFO [localfetcher#49] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#49 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,014 INFO [localfetcher#49] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,014 INFO [localfetcher#49] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,015 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,015 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,015 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,016 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,023 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000048_0 is done. And is in the process of committing
2016-08-12T03:17:32,023 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,023 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000048_0 is allowed to commit now
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000048_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000048
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000048_0' done.
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000048_0
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000049_0
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,024 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@5ef9c089
2016-08-12T03:17:32,025 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,025 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000049_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,026 INFO [localfetcher#50] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#50 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,026 INFO [localfetcher#50] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,026 INFO [localfetcher#50] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,026 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,026 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,027 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,027 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,027 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,028 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,028 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,028 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,028 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,028 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,029 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,035 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000049_0 is done. And is in the process of committing
2016-08-12T03:17:32,035 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,035 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000049_0 is allowed to commit now
2016-08-12T03:17:32,035 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000049_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000049
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000049_0' done.
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000049_0
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000050_0
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@6a927227
2016-08-12T03:17:32,036 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,037 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000050_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,037 INFO [localfetcher#51] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#51 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,038 INFO [localfetcher#51] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,038 INFO [localfetcher#51] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,038 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,038 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,038 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,039 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,039 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,039 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,039 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,039 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,039 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,040 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,040 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,045 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:32,046 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000050_0 is done. And is in the process of committing
2016-08-12T03:17:32,046 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,046 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000050_0 is allowed to commit now
2016-08-12T03:17:32,046 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000050_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000050
2016-08-12T03:17:32,046 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,047 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000050_0' done.
2016-08-12T03:17:32,047 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000050_0
2016-08-12T03:17:32,047 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000051_0
2016-08-12T03:17:32,047 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,047 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,047 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@7cf3f860
2016-08-12T03:17:32,047 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,047 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000051_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,048 INFO [localfetcher#52] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#52 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,048 INFO [localfetcher#52] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,048 INFO [localfetcher#52] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,048 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,049 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,049 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,049 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,049 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,050 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,050 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,050 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,050 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,050 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,050 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,056 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000051_0 is done. And is in the process of committing
2016-08-12T03:17:32,056 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,056 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000051_0 is allowed to commit now
2016-08-12T03:17:32,057 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000051_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000051
2016-08-12T03:17:32,057 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,057 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000051_0' done.
2016-08-12T03:17:32,057 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000051_0
2016-08-12T03:17:32,057 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000052_0
2016-08-12T03:17:32,058 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,058 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,058 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@485c79a1
2016-08-12T03:17:32,058 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,058 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000052_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,059 INFO [localfetcher#53] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#53 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,059 INFO [localfetcher#53] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,059 INFO [localfetcher#53] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,059 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,059 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,059 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,060 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,060 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,060 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,060 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,061 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,061 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,061 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,061 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000052_0 is done. And is in the process of committing
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000052_0 is allowed to commit now
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000052_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000052
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000052_0' done.
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000052_0
2016-08-12T03:17:32,067 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000053_0
2016-08-12T03:17:32,068 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,068 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,068 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@338eab4b
2016-08-12T03:17:32,068 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,068 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000053_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,069 INFO [localfetcher#54] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#54 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,069 INFO [localfetcher#54] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,069 INFO [localfetcher#54] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,069 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,069 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,070 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,070 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,070 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,071 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,071 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,071 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,071 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,071 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,071 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,077 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000053_0 is done. And is in the process of committing
2016-08-12T03:17:32,078 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,078 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000053_0 is allowed to commit now
2016-08-12T03:17:32,078 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000053_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000053
2016-08-12T03:17:32,078 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,078 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000053_0' done.
2016-08-12T03:17:32,078 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000053_0
2016-08-12T03:17:32,078 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000054_0
2016-08-12T03:17:32,079 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,079 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,079 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@18a52c87
2016-08-12T03:17:32,079 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,079 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000054_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,080 INFO [localfetcher#55] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#55 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,080 INFO [localfetcher#55] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,080 INFO [localfetcher#55] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,080 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,080 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,080 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,081 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,081 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,081 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,082 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,082 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,082 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,082 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,082 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,087 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
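One thing worth flagging in the spec above: the `timestampSpec` uses `"format" : "yyyy-mm-dd"`. In Joda-Time (which Druid uses for timestamp parsing), lowercase `mm` means minute-of-hour; month is uppercase `MM`. A value like `2016-07-16` then parses with the month left at its default (January), so every row lands outside the configured intervals — consistent with the "Found approximately [0] rows" lines later in this log. Python's `strptime` has the same `%m`/`%M` pitfall, which makes for a quick, hedged illustration of the effect (an analogy, not Druid's actual parser):

```python
from datetime import datetime

ts = "2016-07-16"

# Correct: %m is month-of-year (analogous to Joda-Time "MM")
ok = datetime.strptime(ts, "%Y-%m-%d")

# Pitfall: %M is minute-of-hour (analogous to the spec's lowercase "mm");
# the unspecified month silently defaults to January
bad = datetime.strptime(ts, "%Y-%M-%d")

print(ok)   # 2016-07-16 00:00:00
print(bad)  # 2016-01-16 00:07:00
```

Changing the spec's format to `yyyy-MM-dd` should put the parsed timestamps back inside the ingestion intervals.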
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000054_0 is done. And is in the process of committing
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000054_0 is allowed to commit now
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000054_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000054
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000054_0' done.
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000054_0
2016-08-12T03:17:32,088 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local895013469_0001_r_000055_0
2016-08-12T03:17:32,089 INFO [pool-19-thread-1] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2016-08-12T03:17:32,089 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : null
2016-08-12T03:17:32,089 INFO [pool-19-thread-1] org.apache.hadoop.mapred.ReduceTask - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@651fa6bb
2016-08-12T03:17:32,089 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - MergerManager: memoryLimit=1336252800, maxSingleShuffleLimit=334063200, mergeThreshold=881926912, ioSortFactor=10, memToMemMergeOutputsThreshold=10
2016-08-12T03:17:32,089 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - attempt_local895013469_0001_r_000055_0 Thread started: EventFetcher for fetching Map Completion Events
2016-08-12T03:17:32,090 INFO [localfetcher#56] org.apache.hadoop.mapreduce.task.reduce.LocalFetcher - localfetcher#56 about to shuffle output of map attempt_local895013469_0001_m_000000_0 decomp: 2 len: 6 to MEMORY
2016-08-12T03:17:32,090 INFO [localfetcher#56] org.apache.hadoop.mapreduce.task.reduce.InMemoryMapOutput - Read 2 bytes from map-output for attempt_local895013469_0001_m_000000_0
2016-08-12T03:17:32,090 INFO [localfetcher#56] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - closeInMemoryFile -> map-output of size: 2, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->2
2016-08-12T03:17:32,090 INFO [EventFetcher for fetching Map Completion Events] org.apache.hadoop.mapreduce.task.reduce.EventFetcher - EventFetcher is interrupted.. Returning
2016-08-12T03:17:32,091 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,091 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
2016-08-12T03:17:32,091 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,091 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,092 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merged 1 segments, 2 bytes to disk to satisfy reduce memory limit
2016-08-12T03:17:32,092 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 1 files, 6 bytes from disk
2016-08-12T03:17:32,092 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.task.reduce.MergeManagerImpl - Merging 0 segments, 0 bytes from memory into reduce
2016-08-12T03:17:32,092 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Merging 1 sorted segments
2016-08-12T03:17:32,092 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Merger - Down to the last merge-pass, with 0 segments left of total size: 0 bytes
2016-08-12T03:17:32,092 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,098 INFO [pool-19-thread-1] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
"spec" : {
"dataSchema" : {
"dataSource" : "no_metrics",
"parser" : {
"type" : "parquet",
"parseSpec" : {
"format" : "json",
"timestampSpec" : {
"column" : "time",
"format" : "yyyy-mm-dd"
},
"dimensionsSpec" : {
"dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
"dimensionExclusions" : [ ],
"spatialDimensions" : [ ]
}
}
},
"metricsSpec" : [ {
"type" : "count",
"name" : "count"
}, {
"type" : "longSum",
"name" : "impression",
"fieldName" : "impression"
}, {
"type" : "longSum",
"name" : "viewable",
"fieldName" : "viewable"
}, {
"type" : "longSum",
"name" : "revenue",
"fieldName" : "revenue"
}, {
"type" : "longSum",
"name" : "proceeds",
"fieldName" : "proceeds"
}, {
"type" : "longSum",
"name" : "spent",
"fieldName" : "spent"
}, {
"type" : "longSum",
"name" : "click_fraud",
"fieldName" : "click_fraud"
}, {
"type" : "longSum",
"name" : "click",
"fieldName" : "clickdelta"
}, {
"type" : "hyperUnique",
"name" : "user_unique",
"fieldName" : "userId"
} ],
"granularitySpec" : {
"type" : "uniform",
"segmentGranularity" : "DAY",
"queryGranularity" : {
"type" : "all"
},
"intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
}
},
"ioConfig" : {
"type" : "hadoop",
"inputSpec" : {
"type" : "static",
"inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
"paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
},
"metadataUpdateSpec" : null,
"segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
},
"tuningConfig" : {
"type" : "hadoop",
"workingPath" : "var/druid/hadoop-tmp",
"version" : "2016-08-12T03:17:23.354Z",
"partitionsSpec" : {
"type" : "hashed",
"targetPartitionSize" : 5000000,
"maxPartitionSize" : 7500000,
"assumeGrouped" : false,
"numShards" : -1,
"partitionDimensions" : [ ]
},
"shardSpecs" : { },
"indexSpec" : {
"bitmap" : {
"type" : "concise"
},
"dimensionCompression" : null,
"metricCompression" : null
},
"maxRowsInMemory" : 75000,
"leaveIntermediate" : false,
"cleanupOnFailure" : true,
"overwriteFiles" : false,
"ignoreInvalidRows" : false,
"jobProperties" : {
"mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
"mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
},
"combineText" : false,
"useCombiner" : false,
"buildV9Directly" : false,
"numBackgroundPersistThreads" : 0
},
"uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
}
}
2016-08-12T03:17:32,099 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task:attempt_local895013469_0001_r_000055_0 is done. And is in the process of committing
2016-08-12T03:17:32,099 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - 1 / 1 copied.
2016-08-12T03:17:32,099 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task attempt_local895013469_0001_r_000055_0 is allowed to commit now
2016-08-12T03:17:32,099 INFO [pool-19-thread-1] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local895013469_0001_r_000055_0' to file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414/groupedData/_temporary/0/task_local895013469_0001_r_000055
2016-08-12T03:17:32,099 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - reduce > reduce
2016-08-12T03:17:32,099 INFO [pool-19-thread-1] org.apache.hadoop.mapred.Task - Task 'attempt_local895013469_0001_r_000055_0' done.
2016-08-12T03:17:32,099 INFO [pool-19-thread-1] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local895013469_0001_r_000055_0
2016-08-12T03:17:32,099 INFO [Thread-53] org.apache.hadoop.mapred.LocalJobRunner - reduce task executor complete.
2016-08-12T03:17:32,810 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - map 100% reduce 100%
2016-08-12T03:17:32,810 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Job job_local895013469_0001 completed successfully
2016-08-12T03:17:32,862 INFO [task-runner-0-priority-0] org.apache.hadoop.mapreduce.Job - Counters: 33
File System Counters
FILE: Number of bytes read=3170724
FILE: Number of bytes written=13621780
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
Map-Reduce Framework
Map input records=1
Map output records=56
Map output bytes=1064
Map output materialized bytes=1512
Input split bytes=390
Combine input records=0
Combine output records=0
Reduce input groups=56
Reduce shuffle bytes=1512
Reduce input records=56
Reduce output records=0
Spilled Records=112
Shuffled Maps =56
Failed Shuffles=0
Merged Map outputs=56
GC time elapsed (ms)=233
Total committed heap usage (bytes)=49906974720
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=5936
parquet
bytesread=1389
bytestotal=1491
timeread=52
2016-08-12T03:17:32,863 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Job completed, loading up partitions for intervals[Optional.of([2016-06-16T00:00:00.000Z/2016-06-17T00:00:00.000Z, 2016-06-17T00:00:00.000Z/2016-06-18T00:00:00.000Z, 2016-06-18T00:00:00.000Z/2016-06-19T00:00:00.000Z, 2016-06-19T00:00:00.000Z/2016-06-20T00:00:00.000Z, 2016-06-20T00:00:00.000Z/2016-06-21T00:00:00.000Z, 2016-06-21T00:00:00.000Z/2016-06-22T00:00:00.000Z, 2016-06-22T00:00:00.000Z/2016-06-23T00:00:00.000Z, 2016-06-23T00:00:00.000Z/2016-06-24T00:00:00.000Z, 2016-06-24T00:00:00.000Z/2016-06-25T00:00:00.000Z, 2016-06-25T00:00:00.000Z/2016-06-26T00:00:00.000Z, 2016-06-26T00:00:00.000Z/2016-06-27T00:00:00.000Z, 2016-06-27T00:00:00.000Z/2016-06-28T00:00:00.000Z, 2016-06-28T00:00:00.000Z/2016-06-29T00:00:00.000Z, 2016-06-29T00:00:00.000Z/2016-06-30T00:00:00.000Z, 2016-06-30T00:00:00.000Z/2016-07-01T00:00:00.000Z, 2016-07-01T00:00:00.000Z/2016-07-02T00:00:00.000Z, 2016-07-02T00:00:00.000Z/2016-07-03T00:00:00.000Z, 2016-07-03T00:00:00.000Z/2016-07-04T00:00:00.000Z, 2016-07-04T00:00:00.000Z/2016-07-05T00:00:00.000Z, 2016-07-05T00:00:00.000Z/2016-07-06T00:00:00.000Z, 2016-07-06T00:00:00.000Z/2016-07-07T00:00:00.000Z, 2016-07-07T00:00:00.000Z/2016-07-08T00:00:00.000Z, 2016-07-08T00:00:00.000Z/2016-07-09T00:00:00.000Z, 2016-07-09T00:00:00.000Z/2016-07-10T00:00:00.000Z, 2016-07-10T00:00:00.000Z/2016-07-11T00:00:00.000Z, 2016-07-11T00:00:00.000Z/2016-07-12T00:00:00.000Z, 2016-07-12T00:00:00.000Z/2016-07-13T00:00:00.000Z, 2016-07-13T00:00:00.000Z/2016-07-14T00:00:00.000Z, 2016-07-14T00:00:00.000Z/2016-07-15T00:00:00.000Z, 2016-07-15T00:00:00.000Z/2016-07-16T00:00:00.000Z, 2016-07-16T00:00:00.000Z/2016-07-17T00:00:00.000Z, 2016-07-17T00:00:00.000Z/2016-07-18T00:00:00.000Z, 2016-07-18T00:00:00.000Z/2016-07-19T00:00:00.000Z, 2016-07-19T00:00:00.000Z/2016-07-20T00:00:00.000Z, 2016-07-20T00:00:00.000Z/2016-07-21T00:00:00.000Z, 2016-07-21T00:00:00.000Z/2016-07-22T00:00:00.000Z, 2016-07-22T00:00:00.000Z/2016-07-23T00:00:00.000Z, 2016-07-23T00:00:00.000Z/2016-07-24T00:00:00.000Z, 2016-07-24T00:00:00.000Z/2016-07-25T00:00:00.000Z, 2016-07-25T00:00:00.000Z/2016-07-26T00:00:00.000Z, 2016-07-26T00:00:00.000Z/2016-07-27T00:00:00.000Z, 2016-07-27T00:00:00.000Z/2016-07-28T00:00:00.000Z, 2016-07-28T00:00:00.000Z/2016-07-29T00:00:00.000Z, 2016-07-29T00:00:00.000Z/2016-07-30T00:00:00.000Z, 2016-07-30T00:00:00.000Z/2016-07-31T00:00:00.000Z, 2016-07-31T00:00:00.000Z/2016-08-01T00:00:00.000Z, 2016-08-01T00:00:00.000Z/2016-08-02T00:00:00.000Z, 2016-08-02T00:00:00.000Z/2016-08-03T00:00:00.000Z, 2016-08-03T00:00:00.000Z/2016-08-04T00:00:00.000Z, 2016-08-04T00:00:00.000Z/2016-08-05T00:00:00.000Z, 2016-08-05T00:00:00.000Z/2016-08-06T00:00:00.000Z, 2016-08-06T00:00:00.000Z/2016-08-07T00:00:00.000Z, 2016-08-07T00:00:00.000Z/2016-08-08T00:00:00.000Z, 2016-08-08T00:00:00.000Z/2016-08-09T00:00:00.000Z, 2016-08-09T00:00:00.000Z/2016-08-10T00:00:00.000Z, 2016-08-10T00:00:00.000Z/2016-08-11T00:00:00.000Z])].
2016-08-12T03:17:32,871 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,871 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,873 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,874 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,875 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,875 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,876 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,876 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,876 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,876 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,877 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,877 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,878 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,878 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,878 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,878 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,879 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,879 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,880 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,880 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,880 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,880 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,881 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,881 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,881 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,881 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,882 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,882 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,882 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,882 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,883 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,883 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,883 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,883 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,884 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,884 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,885 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,885 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,885 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,885 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,886 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,886 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,887 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,887 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,888 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,888 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,888 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,888 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,889 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,889 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,890 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,890 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,890 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,890 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,891 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,891 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,892 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,892 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,892 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,893 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,893 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,893 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,894 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,894 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,895 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,895 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,895 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,895 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,896 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,896 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,896 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,896 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,897 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,897 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,898 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,898 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,898 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,898 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,899 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,899 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,900 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,900 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,900 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,900 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,901 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,901 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,902 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,902 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,902 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,902 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,903 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,903 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,904 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,904 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,904 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,904 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,905 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,905 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,905 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,906 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,906 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,906 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,907 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,907 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,908 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,908 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,908 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,909 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,909 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,909 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,910 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Found approximately [0] rows in data.
2016-08-12T03:17:32,910 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - Creating [0] shards
2016-08-12T03:17:32,910 INFO [task-runner-0-priority-0] io.druid.indexer.DetermineHashedPartitionsJob - DetermineHashedPartitionsJob took 3550 millis
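For context on the "Found approximately [0] rows ... Creating [0] shards" pairs above: DetermineHashedPartitionsJob estimates row cardinality per interval and sizes shards against `targetPartitionSize` (5,000,000 in the tuningConfig), so zero estimated rows yields zero shards. A minimal sketch of that selection — `shards_for_interval` is a hypothetical helper illustrating the ceiling division, not Druid's actual code:

```python
import math

def shards_for_interval(estimated_rows: int, target_partition_size: int) -> int:
    """Sketch: one shard per target_partition_size rows, none for empty intervals."""
    if estimated_rows <= 0:
        return 0
    return math.ceil(estimated_rows / target_partition_size)

print(shards_for_interval(0, 5_000_000))           # 0, matching every interval in this log
print(shards_for_interval(12_000_000, 5_000_000))  # 3
```

With every interval empty, the job finishes quickly (3550 ms here) and the run produces no segments to index.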
2016-08-12T03:17:32,910 INFO [task-runner-0-priority-0] io.druid.indexer.JobHelper - Deleting path[var/druid/hadoop-tmp/no_metrics/2016-08-12T031723.354Z/09396ba7b7364e04ae0567ab43bca414]
2016-08-12T03:17:32,987 INFO [task-runner-0-priority-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Performing action for task[index_hadoop_no_metrics_2016-08-12T03:17:23.355Z]: LockListAction{}
2016-08-12T03:17:32,989 INFO [task-runner-0-priority-0] io.druid.indexing.common.actions.RemoteTaskActionClient - Submitting action for task[index_hadoop_no_metrics_2016-08-12T03:17:23.355Z] to overlord[http://192.168.3.30:8090/druid/indexer/v1/action]: LockListAction{}
2016-08-12T03:17:32,996 INFO [task-runner-0-priority-0] io.druid.indexing.common.task.HadoopIndexTask - Setting version to: 2016-08-12T03:17:23.360Z
2016-08-12T03:17:33,005 INFO [task-runner-0-priority-0] io.druid.indexer.HadoopDruidIndexerConfig - Running with config:
{
  "spec" : {
    "dataSchema" : {
      "dataSource" : "no_metrics",
      "parser" : {
        "type" : "parquet",
        "parseSpec" : {
          "format" : "json",
          "timestampSpec" : {
            "column" : "time",
            "format" : "yyyy-mm-dd"
          },
          "dimensionsSpec" : {
            "dimensions" : [ "advertiser_id", "campaign_id", "payment_id", "creative_id", "website_id", "channel_id", "section_id", "zone_id", "ad_default", "topic_id", "interest_id", "inmarket_id", "audience_id", "os_id", "browser_id", "device_type", "device_id", "location_id", "age_id", "gender_id", "network_id", "merchant_cate", "userId" ],
            "dimensionExclusions" : [ ],
            "spatialDimensions" : [ ]
          }
        }
      },
      "metricsSpec" : [ {
        "type" : "count",
        "name" : "count"
      }, {
        "type" : "longSum",
        "name" : "impression",
        "fieldName" : "impression"
      }, {
        "type" : "longSum",
        "name" : "viewable",
        "fieldName" : "viewable"
      }, {
        "type" : "longSum",
        "name" : "revenue",
        "fieldName" : "revenue"
      }, {
        "type" : "longSum",
        "name" : "proceeds",
        "fieldName" : "proceeds"
      }, {
        "type" : "longSum",
        "name" : "spent",
        "fieldName" : "spent"
      }, {
        "type" : "longSum",
        "name" : "click_fraud",
        "fieldName" : "click_fraud"
      }, {
        "type" : "longSum",
        "name" : "click",
        "fieldName" : "clickdelta"
      }, {
        "type" : "hyperUnique",
        "name" : "user_unique",
        "fieldName" : "userId"
      } ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "DAY",
        "queryGranularity" : {
          "type" : "all"
        },
        "intervals" : [ "2016-06-16T00:00:00.000Z/2016-08-11T00:00:00.000Z" ]
      }
    },
    "ioConfig" : {
      "type" : "hadoop",
      "inputSpec" : {
        "type" : "static",
        "inputFormat" : "io.druid.data.input.parquet.DruidParquetInputFormat",
        "paths" : "/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet"
      },
      "metadataUpdateSpec" : null,
      "segmentOutputPath" : "file:/Users/giaosudau/Downloads/imply-1.3.0/var/druid/segments/no_metrics"
    },
    "tuningConfig" : {
      "type" : "hadoop",
      "workingPath" : "var/druid/hadoop-tmp",
      "version" : "2016-08-12T03:17:23.360Z",
      "partitionsSpec" : {
        "type" : "hashed",
        "targetPartitionSize" : 5000000,
        "maxPartitionSize" : 7500000,
        "assumeGrouped" : false,
        "numShards" : -1,
        "partitionDimensions" : [ ]
      },
      "shardSpecs" : {
        "2016-06-16T00:00:00.000Z" : [ ],
        "2016-06-17T00:00:00.000Z" : [ ],
        "2016-06-18T00:00:00.000Z" : [ ],
        "2016-06-19T00:00:00.000Z" : [ ],
        "2016-06-20T00:00:00.000Z" : [ ],
        "2016-06-21T00:00:00.000Z" : [ ],
        "2016-06-22T00:00:00.000Z" : [ ],
        "2016-06-23T00:00:00.000Z" : [ ],
        "2016-06-24T00:00:00.000Z" : [ ],
        "2016-06-25T00:00:00.000Z" : [ ],
        "2016-06-26T00:00:00.000Z" : [ ],
        "2016-06-27T00:00:00.000Z" : [ ],
        "2016-06-28T00:00:00.000Z" : [ ],
        "2016-06-29T00:00:00.000Z" : [ ],
        "2016-06-30T00:00:00.000Z" : [ ],
        "2016-07-01T00:00:00.000Z" : [ ],
        "2016-07-02T00:00:00.000Z" : [ ],
        "2016-07-03T00:00:00.000Z" : [ ],
        "2016-07-04T00:00:00.000Z" : [ ],
        "2016-07-05T00:00:00.000Z" : [ ],
        "2016-07-06T00:00:00.000Z" : [ ],
        "2016-07-07T00:00:00.000Z" : [ ],
        "2016-07-08T00:00:00.000Z" : [ ],
        "2016-07-09T00:00:00.000Z" : [ ],
        "2016-07-10T00:00:00.000Z" : [ ],
        "2016-07-11T00:00:00.000Z" : [ ],
        "2016-07-12T00:00:00.000Z" : [ ],
        "2016-07-13T00:00:00.000Z" : [ ],
        "2016-07-14T00:00:00.000Z" : [ ],
        "2016-07-15T00:00:00.000Z" : [ ],
        "2016-07-16T00:00:00.000Z" : [ ],
        "2016-07-17T00:00:00.000Z" : [ ],
        "2016-07-18T00:00:00.000Z" : [ ],
        "2016-07-19T00:00:00.000Z" : [ ],
        "2016-07-20T00:00:00.000Z" : [ ],
        "2016-07-21T00:00:00.000Z" : [ ],
        "2016-07-22T00:00:00.000Z" : [ ],
        "2016-07-23T00:00:00.000Z" : [ ],
        "2016-07-24T00:00:00.000Z" : [ ],
        "2016-07-25T00:00:00.000Z" : [ ],
        "2016-07-26T00:00:00.000Z" : [ ],
        "2016-07-27T00:00:00.000Z" : [ ],
        "2016-07-28T00:00:00.000Z" : [ ],
        "2016-07-29T00:00:00.000Z" : [ ],
        "2016-07-30T00:00:00.000Z" : [ ],
        "2016-07-31T00:00:00.000Z" : [ ],
        "2016-08-01T00:00:00.000Z" : [ ],
        "2016-08-02T00:00:00.000Z" : [ ],
        "2016-08-03T00:00:00.000Z" : [ ],
        "2016-08-04T00:00:00.000Z" : [ ],
        "2016-08-05T00:00:00.000Z" : [ ],
        "2016-08-06T00:00:00.000Z" : [ ],
        "2016-08-07T00:00:00.000Z" : [ ],
        "2016-08-08T00:00:00.000Z" : [ ],
        "2016-08-09T00:00:00.000Z" : [ ],
        "2016-08-10T00:00:00.000Z" : [ ]
      },
      "indexSpec" : {
        "bitmap" : {
          "type" : "concise"
        },
        "dimensionCompression" : null,
        "metricCompression" : null
      },
      "maxRowsInMemory" : 75000,
      "leaveIntermediate" : false,
      "cleanupOnFailure" : true,
      "overwriteFiles" : false,
      "ignoreInvalidRows" : false,
      "jobProperties" : {
        "mapreduce.map.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8",
        "mapreduce.reduce.java.opts" : "-Duser.timezone=UTC -Dfile.encoding=UTF-8"
      },
      "combineText" : false,
      "useCombiner" : false,
      "buildV9Directly" : false,
      "numBackgroundPersistThreads" : 0
    },
    "uniqueId" : "09396ba7b7364e04ae0567ab43bca414"
  }
}
2016-08-12T03:17:33,006 INFO [task-runner-0-priority-0] io.druid.indexing.common.task.HadoopIndexTask - Starting a hadoop index generator job...
2016-08-12T03:17:33,020 INFO [task-runner-0-priority-0] io.druid.indexer.path.StaticPathSpec - Adding paths[/Users/giaosudau/FACT_AD_STATS_DAILY/time=2016-07-16/network_id=31713/part-r-00000-5e5c7291-e1e1-462d-9cc6-7ef2d5be892f.snappy.parquet]
2016-08-12T03:17:33,026 INFO [task-runner-0-priority-0] io.druid.indexer.HadoopDruidIndexerJob - No metadataStorageUpdaterJob set in the config. This is cool if you are running a hadoop index task, otherwise nothing will be uploaded to database.
2016-08-12T03:17:33,059 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[HadoopIndexTask{id=index_hadoop_no_metrics_2016-08-12T03:17:23.355Z, type=index_hadoop, dataSource=no_metrics}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:204) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:208) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.9.1.1.jar:0.9.1.1]
at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.9.1.1.jar:0.9.1.1]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_77]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_77]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_77]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_77]
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_77]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_77]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_77]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_77]
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
... 7 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: No buckets?? seems there is no data to index.
at io.druid.indexer.IndexGeneratorJob.run(IndexGeneratorJob.java:211) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
at io.druid.indexer.JobHelper.runJobs(JobHelper.java:323) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
at io.druid.indexer.HadoopDruidIndexerJob.run(HadoopDruidIndexerJob.java:94) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
at io.druid.indexing.common.task.HadoopIndexTask$HadoopIndexGeneratorInnerProcessing.runTask(HadoopIndexTask.java:261) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_77]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_77]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_77]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_77]
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
... 7 more
Caused by: java.lang.RuntimeException: No buckets?? seems there is no data to index.
at io.druid.indexer.IndexGeneratorJob.run(IndexGeneratorJob.java:172) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
at io.druid.indexer.JobHelper.runJobs(JobHelper.java:323) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
at io.druid.indexer.HadoopDruidIndexerJob.run(HadoopDruidIndexerJob.java:94) ~[druid-indexing-hadoop-0.9.1.1.jar:0.9.1.1]
at io.druid.indexing.common.task.HadoopIndexTask$HadoopIndexGeneratorInnerProcessing.runTask(HadoopIndexTask.java:261) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_77]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_77]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_77]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_77]
at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:201) ~[druid-indexing-service-0.9.1.1.jar:0.9.1.1]
... 7 more
2016-08-12T03:17:33,071 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_no_metrics_2016-08-12T03:17:23.355Z] status changed to [FAILED].
2016-08-12T03:17:33,077 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
"id" : "index_hadoop_no_metrics_2016-08-12T03:17:23.355Z",
"status" : "FAILED",
"duration" : 6102
}
2016-08-12T03:17:33,082 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.coordination.AbstractDataSegmentAnnouncer.stop()] on object[io.druid.server.coordination.BatchDataSegmentAnnouncer@421d54b3].
2016-08-12T03:17:33,082 INFO [main] io.druid.server.coordination.AbstractDataSegmentAnnouncer - Stopping class io.druid.server.coordination.BatchDataSegmentAnnouncer with config[io.druid.server.initialization.ZkPathsConfig@22e2266d]
2016-08-12T03:17:33,082 INFO [main] io.druid.curator.announcement.Announcer - unannouncing [/druid/announcements/192.168.3.30:8100]
2016-08-12T03:17:33,093 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.server.listener.announcer.ListenerResourceAnnouncer.stop()] on object[io.druid.query.lookup.LookupResourceListenerAnnouncer@5b8572df].
2016-08-12T03:17:33,093 INFO [main] io.druid.curator.announcement.Announcer - unannouncing [/druid/listeners/lookups/__default/192.168.3.30:8100]
2016-08-12T03:17:33,094 INFO [main] io.druid.server.listener.announcer.ListenerResourceAnnouncer - Unannouncing start time on [/druid/listeners/lookups/__default/192.168.3.30:8100]
2016-08-12T03:17:33,095 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.query.lookup.LookupReferencesManager.stop()] on object[io.druid.query.lookup.LookupReferencesManager@37c5284a].
2016-08-12T03:17:33,095 INFO [main] io.druid.query.lookup.LookupReferencesManager - Stopping lookup factory references manager
2016-08-12T03:17:33,099 INFO [main] org.eclipse.jetty.server.ServerConnector - Stopped ServerConnector@794eeaf8{HTTP/1.1}{0.0.0.0:8100}
2016-08-12T03:17:33,100 INFO [main] org.eclipse.jetty.server.handler.ContextHandler - Stopped o.e.j.s.ServletContextHandler@15c6027d{/,null,UNAVAILABLE}
2016-08-12T03:17:33,102 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.worker.executor.ExecutorLifecycle.stop() throws java.lang.Exception] on object[io.druid.indexing.worker.executor.ExecutorLifecycle@54089484].
2016-08-12T03:17:33,103 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.indexing.overlord.ThreadPoolTaskRunner.stop()] on object[io.druid.indexing.overlord.ThreadPoolTaskRunner@28ee0a3c].
2016-08-12T03:17:33,103 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[io.druid.curator.discovery.ServerDiscoverySelector@6c479fdf].
2016-08-12T03:17:33,106 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.announcement.Announcer.stop()] on object[io.druid.curator.announcement.Announcer@66de00f2].
2016-08-12T03:17:33,106 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.curator.discovery.ServerDiscoverySelector.stop() throws java.io.IOException] on object[io.druid.curator.discovery.ServerDiscoverySelector@767599a7].
2016-08-12T03:17:33,106 INFO [main] io.druid.curator.CuratorModule - Stopping Curator
2016-08-12T03:17:33,107 INFO [Curator-Framework-0] org.apache.curator.framework.imps.CuratorFrameworkImpl - backgroundOperationsLoop exiting
2016-08-12T03:17:33,108 INFO [main] org.apache.zookeeper.ZooKeeper - Session: 0x1567cac9a64000b closed
2016-08-12T03:17:33,108 INFO [main-EventThread] org.apache.zookeeper.ClientCnxn - EventThread shut down for session: 0x1567cac9a64000b
2016-08-12T03:17:33,108 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.http.client.NettyHttpClient.stop()] on object[com.metamx.http.client.NettyHttpClient@5afbd567].
2016-08-12T03:17:33,115 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.metrics.MonitorScheduler.stop()] on object[com.metamx.metrics.MonitorScheduler@30839e44].
2016-08-12T03:17:33,115 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void com.metamx.emitter.service.ServiceEmitter.close() throws java.io.IOException] on object[com.metamx.emitter.service.ServiceEmitter@10a98392].
2016-08-12T03:17:33,115 INFO [main] com.metamx.emitter.core.LoggingEmitter - Close: started [false]
2016-08-12T03:17:33,115 INFO [main] com.metamx.common.lifecycle.Lifecycle$AnnotationBasedHandler - Invoking stop method[public void io.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner.stop()] on object[io.druid.initialization.Log4jShutterDownerModule$Log4jShutterDowner@138aa3cc].
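Note on the failure above: a likely root cause is the `timestampSpec` format `"yyyy-mm-dd"` in the spec. In the Joda-Time pattern syntax that Druid uses, lowercase `mm` is minute-of-hour; month-of-year is uppercase `MM`. With this pattern the month field is never parsed, so every timestamp falls in January, outside the configured intervals (`2016-06-16/2016-08-11`). That is consistent with DetermineHashedPartitionsJob finding approximately [0] rows per interval and IndexGeneratorJob aborting with "No buckets?? seems there is no data to index." A hedged sketch of the corrected fragment (assuming the Parquet `time` column really holds `2016-07-16`-style dates):

```json
"timestampSpec" : {
  "column" : "time",
  "format" : "yyyy-MM-dd"
}
```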