Hive server and test running on CDH 4.1.3 (client) against CDH 4.1.1 (Hadoop server).
Connected to the daemon. Dispatching Build{id=a33a8fb6-769b-432e-a5c4-a869bea86416.1, currentDir=E:\work\i21\spring-hadoop\head} request.
The client will now receive all logging from the daemon (pid: 9532). The daemon log file: C:\Users\costin\.gradle\daemon\1.3\daemon-9532.out.log
Executing build with daemon context: DefaultDaemonContext[uid=563a00d2-9a54-4442-a25a-883cead43ebd,javaHome=E:\jvm\jdk6,daemonRegistryDir=C:\Users\costin\.gradle\daemon,pid=9532,idleTimeout=10800000,daemonOpts=-XX:MaxPermSize=256m,-XX:+HeapDumpOnOutOfMemoryError,-Xmx1024m,-Dfile.encoding=windows-1252]
Starting Build
Applying init.gradle to add Artifactory credentials
Settings evaluated using settings file 'E:\work\i21\spring-hadoop\head\settings.gradle'.
Projects loaded. Root project using build file 'E:\work\i21\spring-hadoop\head\build.gradle'.
Included projects: [root project 'spring-data-hadoop']
Evaluating root project 'spring-data-hadoop' using build file 'E:\work\i21\spring-hadoop\head\build.gradle'.
:: loading settings :: url = jar:file:/R:/libs/gradle-1.3/lib/ivy-2.2.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
Using Cloudera CDH4 [2.0.0-mr1-cdh4.1.3]
All projects evaluated.
Selected primary tasks 'enableHiveTests', 'test'
Tasks to be executed: [task ':enableHiveTests', task ':compileJava', task ':processResources', task ':classes', task ':compileTestJava', task ':processTestResources', task ':testClasses', task ':test']
:enableHiveTests
Task ':enableHiveTests' has not declared any outputs, assuming that it is out-of-date.
:compileJava
Compiling with JDK 6 Java compiler API.
:processResources
:classes
Skipping task ':classes' as it has no actions.
:compileTestJava
Compiling with JDK 6 Java compiler API.
:processTestResources
:testClasses
Skipping task ':testClasses' as it has no actions.
:test
Running single tests with pattern: [**/BasicHiveTest*.class]
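The `[**/BasicHiveTest*.class]` pattern above is the include filter Gradle 1.x derives from its `test.single` system property. A sketch of the invocation that likely produced this run, with the task names taken from the log; the exact property syntax is an assumption based on Gradle 1.3 conventions, not shown in the log itself:

```shell
# Enable the Hive tests, then run only BasicHiveTest.
# -Dtest.single=BasicHiveTest expands to the **/BasicHiveTest*.class pattern.
gradle enableHiveTests test -Dtest.single=BasicHiveTest
```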
Skipping [Pig HBase WebHdfs] Tests
Starting process 'Gradle Worker 1'. Working directory: E:\work\i21\spring-hadoop\head Command: E:\jvm\jdk6\bin\java.exe -Dinput.path=build/classes/test/input -Djava.security.manager=jarjar.org.gradle.process.internal.child.BootstrapSecurityManager -Doutput.path=build/classes/test/output -Dfile.encoding=windows-1252 -ea -cp C:\Users\costin\.gradle\caches\1.3\workerMain\gradle-worker.jar jarjar.org.gradle.process.internal.launcher.GradleWorkerMain
Successfully started process 'Gradle Worker 1'
Gradle Worker 1 executing tests.
org.springframework.data.hadoop.hive.BasicHiveTest STANDARD_OUT
23:22:30,015 INFO Test worker context.TestContextManager - @TestExecutionListeners is not present for class [class org.springframework.data.hadoop.hive.BasicHiveTest]: using defaults.
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:30,135 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/hive/basic.xml]
23:22:30,309 INFO Test worker xml.XmlBeanDefinitionReader - Loading XML bean definitions from class path resource [org/springframework/data/hadoop/hadoop-ctx.xml]
23:22:30,461 INFO Test worker support.GenericApplicationContext - Refreshing org.springframework.context.support.GenericApplicationContext@561557c0: startup date [Wed Feb 13 23:22:30 EET 2013]; root of context hierarchy
23:22:30,650 INFO Test worker config.PropertyPlaceholderConfigurer - Loading properties file from class path resource [test.properties]
23:22:30,687 INFO Test worker support.DefaultListableBeanFactory - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@2658dd2d: defining beans [ppc,hadoopFs,hadoopResourceLoader,hadoopConfiguration,cfg-init,fs-init,rl-init,org.springframework.data.hadoop.scripting.HdfsScriptRunner#0,hiveServer,props,hiveClientFactory,hiveTemplate,foo,hiveRunner,hiveDriver,hiveDs,template,org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor]; root of factory hierarchy
23:22:30,724 WARN Test worker conf.Configuration - DEPRECATED: hadoop-site.xml found in the classpath. Usage of hadoop-site.xml is deprecated. Instead use core-site.xml, mapred-site.xml and hdfs-site.xml to override properties of core-default.xml, mapred-default.xml and hdfs-default.xml respectively
23:22:30,757 WARN Test worker conf.Configuration - fs.default.name is deprecated. Instead, use fs.defaultFS
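The two warnings above come from the Hadoop client itself: `hadoop-site.xml` has been split into `core-site.xml`, `mapred-site.xml`, and `hdfs-site.xml`, and `fs.default.name` is superseded by `fs.defaultFS`. A minimal `core-site.xml` sketch that silences both; the host and port are illustrative placeholders, not values from this log:

```xml
<!-- core-site.xml: replaces the deprecated hadoop-site.xml / fs.default.name -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- placeholder; substitute the actual CDH 4.1.1 namenode address -->
    <value>hdfs://namenode-host:8020</value>
  </property>
</configuration>
```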
23:22:31,426 WARN Test worker util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Fudging the 'path.separator' on Win for DistributedCache to work
23:22:31,816 WARN Test worker conf.HiveConf - hive-site.xml not found on CLASSPATH
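With no `hive-site.xml` on the classpath, Hive falls back to its built-in defaults, which is why the metastore below opens an embedded Derby database (`jdbc:derby:;databaseName=metastore_db;create=true`). A minimal `hive-site.xml` sketch that makes those defaults explicit; the Derby URL matches the one in this log, and the warehouse path is Hive's stock default rather than a value observed here:

```xml
<!-- Minimal hive-site.xml sketch; values mirror the embedded-Derby
     defaults this run ends up using implicitly -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:;databaseName=metastore_db;create=true</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/user/hive/warehouse</value>
  </property>
</configuration>
```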
23:22:32,030 INFO Test worker config.PropertiesFactoryBean - Loading properties file from class path resource [props.properties]
23:22:32,099 INFO Test worker support.DefaultLifecycleProcessor - Starting beans in phase -2147483648
23:22:32,167 INFO pool-3-thread-1 metastore.HiveMetaStore - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
23:22:32,191 INFO pool-3-thread-1 metastore.ObjectStore - ObjectStore, initialize called
23:22:32,369 ERROR pool-3-thread-1 DataNucleus.Plugin - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
23:22:32,369 ERROR pool-3-thread-1 DataNucleus.Plugin - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
23:22:32,369 ERROR pool-3-thread-1 DataNucleus.Plugin - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
23:22:32,451 INFO pool-3-thread-1 DataNucleus.Persistence - Property datanucleus.cache.level2 unknown - will be ignored
23:22:32,453 INFO pool-3-thread-1 DataNucleus.Persistence - Property javax.jdo.option.NonTransactionalRead unknown - will be ignored
23:22:32,453 INFO pool-3-thread-1 DataNucleus.Persistence - ================= Persistence Configuration ===============
23:22:32,456 INFO pool-3-thread-1 DataNucleus.Persistence - DataNucleus Persistence Factory - Vendor: "DataNucleus" Version: "2.0.3"
23:22:32,456 INFO pool-3-thread-1 DataNucleus.Persistence - DataNucleus Persistence Factory initialised for datastore URL="jdbc:derby:;databaseName=metastore_db;create=true" driver="org.apache.derby.jdbc.EmbeddedDriver" userName="APP"
23:22:32,456 INFO pool-3-thread-1 DataNucleus.Persistence - ===========================================================
23:22:34,836 INFO pool-3-thread-1 Datastore.Schema - Initialising Catalog "", Schema "APP" using "None" auto-start option
23:22:34,836 INFO pool-3-thread-1 Datastore.Schema - Catalog "", Schema "APP" initialised - managing 0 classes
23:22:34,991 INFO pool-3-thread-1 metastore.ObjectStore - Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
23:22:35,026 INFO pool-3-thread-1 DataNucleus.MetaData - Registering listener for metadata initialisation
23:22:35,053 INFO pool-3-thread-1 metastore.ObjectStore - Initialized ObjectStore
23:22:35,208 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 28, column 6 : cvc-elt.1: Cannot find the declaration of element 'jdo'. - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,259 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 338, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,263 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 385, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,265 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 407, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,268 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 442, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,271 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 479, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,275 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 520, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,279 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 561, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,282 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 602, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,286 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 647, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,290 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 692, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,293 WARN pool-3-thread-1 DataNucleus.MetaData - MetaData Parser encountered an error in file "jar:file:/C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-metastore/0.9.0-cdh4.1.3/jar/a476212e9c172da03b67466ca2f3ac798ae8bed0/hive-metastore-0.9.0-cdh4.1.3.jar!/package.jdo" at line 720, column 13 : The content of element type "class" must match "(extension*,implements*,datastore-identity?,primary-key?,inheritance?,version?,join*,foreign-key*,index*,unique*,column*,field*,property*,query*,fetch-group*,extension*)". - Please check your specification of DTD and the validity of the MetaData XML that you have specified.
23:22:35,436 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MDatabase [Table : DBS, InheritanceStrategy : new-table]
23:22:35,458 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MDatabase.parameters [Table : DATABASE_PARAMS]
23:22:35,565 INFO pool-3-thread-1 Datastore.Schema - Creating table DBS
23:22:35,649 INFO pool-3-thread-1 Datastore.Schema - Creating table DATABASE_PARAMS
23:22:35,691 INFO pool-3-thread-1 Datastore.Schema - Creating index "UNIQUE_DATABASE" in catalog "" schema ""
23:22:35,706 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "DATABASE_PARAMS_FK1" in catalog "" schema ""
23:22:35,736 INFO pool-3-thread-1 Datastore.Schema - Creating index "DATABASE_PARAMS_N49" in catalog "" schema ""
23:22:35,746 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232235710.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232235710.
at org.apache.derby.iapi.error.StandardException.newWarningCommon(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newWarning(Unknown Source)
at org.apache.derby.impl.sql.execute.CreateIndexConstantAction.executeConstantAction(Unknown Source)
at org.apache.derby.impl.sql.execute.MiscResultSet.open(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:652)
at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:434)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2768)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:243)
at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:391)
at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:410)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
at $Proxy24.getDatabase(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:382)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:403)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:309)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:279)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.<init>(HiveServer.java:132)
at org.apache.hadoop.hive.service.HiveServer$ThriftHiveProcessorFactory.getProcessor(HiveServer.java:552)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:169)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
23:22:35,864 INFO pool-3-thread-1 DataNucleus.MetaData - Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MDatabase
23:22:35,905 INFO pool-3-thread-1 Datastore.Schema - Creating table SEQUENCE_TABLE
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132322_370066675.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:36,290 INFO pool-3-thread-1 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132322_370066675.txt
23:22:36,471 INFO pool-3-thread-1 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323226782493664761759038.pipeout
23:22:36,513 INFO pool-3-thread-1 service.HiveServer - Running the query: set hive.fetch.output.serde = org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
23:22:36,517 INFO pool-3-thread-1 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323226782493664761759038.pipeout
23:22:36,551 INFO pool-3-thread-1 service.HiveServer - Running the query: drop table if exists testHiveDriverTable
23:22:36,557 INFO pool-3-thread-1 ql.Driver - <PERFLOG method=Driver.run>
23:22:36,557 INFO pool-3-thread-1 ql.Driver - <PERFLOG method=TimeToSubmit>
23:22:36,557 INFO pool-3-thread-1 ql.Driver - <PERFLOG method=compile>
23:22:36,585 INFO pool-3-thread-1 parse.ParseDriver - Parsing command: drop table if exists testHiveDriverTable
23:22:36,691 INFO pool-3-thread-1 parse.ParseDriver - Parse Completed
23:22:36,722 INFO pool-3-thread-1 metastore.HiveMetaStore - 0: get_table : db=default tbl=testHiveDriverTable
23:22:36,723 INFO pool-3-thread-1 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_table : db=default tbl=testHiveDriverTable
23:22:36,726 INFO pool-3-thread-1 metastore.HiveMetaStore - 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
23:22:36,727 INFO pool-3-thread-1 metastore.ObjectStore - ObjectStore, initialize called
23:22:36,727 INFO pool-3-thread-1 metastore.ObjectStore - Initialized ObjectStore
23:22:36,728 INFO pool-3-thread-1 DataNucleus.Datastore - The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
23:22:36,728 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MColumnDescriptor [Table : CDS, InheritanceStrategy : new-table]
23:22:36,729 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MSerDeInfo [Table : SERDES, InheritanceStrategy : new-table]
23:22:36,729 INFO pool-3-thread-1 DataNucleus.Datastore - The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
23:22:36,729 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MStorageDescriptor [Table : SDS, InheritanceStrategy : new-table]
23:22:36,729 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Class : org.apache.hadoop.hive.metastore.model.MTable [Table : TBLS, InheritanceStrategy : new-table]
23:22:36,729 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MSerDeInfo.parameters [Table : SERDE_PARAMS]
23:22:36,731 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.parameters [Table : TABLE_PARAMS]
23:22:36,734 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MTable.partitionKeys [Table : PARTITION_KEYS]
23:22:36,736 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.bucketCols [Table : BUCKETING_COLS]
23:22:36,739 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.parameters [Table : SD_PARAMS]
23:22:36,739 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MStorageDescriptor.sortCols [Table : SORT_COLS]
23:22:36,740 INFO pool-3-thread-1 DataNucleus.Persistence - Managing Persistence of Field : org.apache.hadoop.hive.metastore.model.MColumnDescriptor.cols [Table : COLUMNS_V2]
23:22:36,749 INFO pool-3-thread-1 Datastore.Schema - Creating table SERDES
23:22:36,787 INFO pool-3-thread-1 Datastore.Schema - Creating table TBLS
23:22:36,832 INFO pool-3-thread-1 Datastore.Schema - Creating table SDS
23:22:36,878 INFO pool-3-thread-1 Datastore.Schema - Creating table CDS
23:22:36,912 INFO pool-3-thread-1 Datastore.Schema - Creating table SERDE_PARAMS
23:22:36,951 INFO pool-3-thread-1 Datastore.Schema - Creating table TABLE_PARAMS
23:22:36,991 INFO pool-3-thread-1 Datastore.Schema - Creating table SD_PARAMS
23:22:37,028 INFO pool-3-thread-1 Datastore.Schema - Creating table SORT_COLS
23:22:37,064 INFO pool-3-thread-1 Datastore.Schema - Creating table COLUMNS_V2
23:22:37,099 INFO pool-3-thread-1 Datastore.Schema - Creating table BUCKETING_COLS
23:22:37,130 INFO pool-3-thread-1 Datastore.Schema - Creating table PARTITION_KEYS
23:22:37,157 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "TBLS_FK2" in catalog "" schema ""
23:22:37,177 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "TBLS_FK1" in catalog "" schema ""
23:22:37,197 INFO pool-3-thread-1 Datastore.Schema - Creating index "TBLS_N50" in catalog "" schema ""
23:22:37,202 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237160.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237160.
at org.apache.derby.iapi.error.StandardException.newWarningCommon(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newWarning(Unknown Source)
at org.apache.derby.impl.sql.execute.CreateIndexConstantAction.executeConstantAction(Unknown Source)
at org.apache.derby.impl.sql.execute.MiscResultSet.open(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:652)
at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:434)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2768)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:787)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:721)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
at $Proxy24.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1210)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:749)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:889)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeDropTable(DDLSemanticAnalyzer.java:706)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeInternal(DDLSemanticAnalyzer.java:212)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:246)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:892)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:192)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:630)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:618)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
23:22:37,204 INFO pool-3-thread-1 Datastore.Schema - Creating index "UniqueTable" in catalog "" schema ""
23:22:37,234 INFO pool-3-thread-1 Datastore.Schema - Creating index "TBLS_N49" in catalog "" schema ""
23:22:37,240 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237180.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237180.
at org.apache.derby.iapi.error.StandardException.newWarningCommon(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newWarning(Unknown Source)
at org.apache.derby.impl.sql.execute.CreateIndexConstantAction.executeConstantAction(Unknown Source)
at org.apache.derby.impl.sql.execute.MiscResultSet.open(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:652)
at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:434)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2768)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:787)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:721)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
at $Proxy24.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1210)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:749)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:889)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeDropTable(DDLSemanticAnalyzer.java:706)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeInternal(DDLSemanticAnalyzer.java:212)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:246)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:892)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:192)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:630)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:618)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
23:22:37,245 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "SDS_FK1" in catalog "" schema ""
23:22:37,264 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "SDS_FK2" in catalog "" schema ""
23:22:37,285 INFO pool-3-thread-1 Datastore.Schema - Creating index "SDS_N50" in catalog "" schema ""
23:22:37,290 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237250.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237250.
	[stack trace omitted; identical to the first SQLWarning trace above]
23:22:37,292 INFO pool-3-thread-1 Datastore.Schema - Creating index "SDS_N49" in catalog "" schema ""
23:22:37,298 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237270.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237270.
	[stack trace omitted; identical to the first SQLWarning trace above]
23:22:37,300 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "SERDE_PARAMS_FK1" in catalog "" schema ""
23:22:37,314 INFO pool-3-thread-1 Datastore.Schema - Creating index "SERDE_PARAMS_N49" in catalog "" schema ""
23:22:37,318 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237300.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237300.
	[stack trace omitted; identical to the first SQLWarning trace above]
23:22:37,319 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "TABLE_PARAMS_FK1" in catalog "" schema ""
23:22:37,362 INFO pool-3-thread-1 Datastore.Schema - Creating index "TABLE_PARAMS_N49" in catalog "" schema ""
23:22:37,366 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237320.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237320.
	[stack trace omitted; identical to the first SQLWarning trace above]
23:22:37,367 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "SD_PARAMS_FK1" in catalog "" schema ""
23:22:37,407 INFO pool-3-thread-1 Datastore.Schema - Creating index "SD_PARAMS_N49" in catalog "" schema ""
23:22:37,413 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237370.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237370.
	[stack trace omitted; identical to the first SQLWarning trace above]
23:22:37,415 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "SORT_COLS_FK1" in catalog "" schema ""
23:22:37,444 INFO pool-3-thread-1 Datastore.Schema - Creating index "SORT_COLS_N49" in catalog "" schema ""
23:22:37,451 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237420.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237420.
at org.apache.derby.iapi.error.StandardException.newWarningCommon(Unknown Source)
at org.apache.derby.iapi.error.StandardException.newWarning(Unknown Source)
at org.apache.derby.impl.sql.execute.CreateIndexConstantAction.executeConstantAction(Unknown Source)
at org.apache.derby.impl.sql.execute.MiscResultSet.open(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.executeStmt(Unknown Source)
at org.apache.derby.impl.sql.GenericPreparedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.executeStatement(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.derby.impl.jdbc.EmbedStatement.execute(Unknown Source)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.apache.commons.dbcp.DelegatingStatement.execute(DelegatingStatement.java:264)
at org.datanucleus.store.rdbms.table.AbstractTable.executeDdlStatement(AbstractTable.java:730)
at org.datanucleus.store.rdbms.table.TableImpl.createIndices(TableImpl.java:652)
at org.datanucleus.store.rdbms.table.TableImpl.createConstraints(TableImpl.java:434)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:2768)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.addClassTablesAndValidate(RDBMSStoreManager.java:2503)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2148)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:113)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:986)
at org.datanucleus.store.rdbms.RDBMSStoreManager.addClasses(RDBMSStoreManager.java:952)
at org.datanucleus.store.AbstractStoreManager.addClass(AbstractStoreManager.java:919)
at org.datanucleus.store.mapped.MappedStoreManager.getDatastoreClass(MappedStoreManager.java:356)
at org.datanucleus.store.rdbms.query.legacy.ExtentHelper.getExtent(ExtentHelper.java:48)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getExtent(RDBMSStoreManager.java:1332)
at org.datanucleus.ObjectManagerImpl.getExtent(ObjectManagerImpl.java:4149)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compileCandidates(JDOQLQueryCompiler.java:411)
at org.datanucleus.store.rdbms.query.legacy.QueryCompiler.executionCompile(QueryCompiler.java:312)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQueryCompiler.compile(JDOQLQueryCompiler.java:225)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.compileInternal(JDOQLQuery.java:175)
at org.datanucleus.store.query.Query.executeQuery(Query.java:1628)
at org.datanucleus.store.rdbms.query.legacy.JDOQLQuery.executeQuery(JDOQLQuery.java:245)
at org.datanucleus.store.query.Query.executeWithArray(Query.java:1499)
at org.datanucleus.jdo.JDOQuery.execute(JDOQuery.java:266)
at org.apache.hadoop.hive.metastore.ObjectStore.getMTable(ObjectStore.java:787)
at org.apache.hadoop.hive.metastore.ObjectStore.getTable(ObjectStore.java:721)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.hive.metastore.RetryingRawStore.invoke(RetryingRawStore.java:111)
at $Proxy24.getTable(Unknown Source)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1210)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:749)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:889)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeDropTable(DDLSemanticAnalyzer.java:706)
at org.apache.hadoop.hive.ql.parse.DDLSemanticAnalyzer.analyzeInternal(DDLSemanticAnalyzer.java:212)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:246)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:432)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:337)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:892)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:192)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:630)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:618)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
23:22:37,453 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "COLUMNS_V2_FK1" in catalog "" schema ""
23:22:37,481 INFO pool-3-thread-1 Datastore.Schema - Creating index "COLUMNS_V2_N49" in catalog "" schema ""
23:22:37,490 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237460.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237460.
[stack trace identical to the duplicate-index warning above]
23:22:37,492 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "BUCKETING_COLS_FK1" in catalog "" schema ""
23:22:37,525 INFO pool-3-thread-1 Datastore.Schema - Creating index "BUCKETING_COLS_N49" in catalog "" schema ""
23:22:37,530 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237500.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237500.
[stack trace identical to the duplicate-index warning above]
23:22:37,533 INFO pool-3-thread-1 Datastore.Schema - Creating foreign key constraint : "PARTITION_KEYS_FK1" in catalog "" schema ""
23:22:37,564 INFO pool-3-thread-1 Datastore.Schema - Creating index "PARTITION_KEYS_N49" in catalog "" schema ""
23:22:37,570 WARN pool-3-thread-1 DataNucleus.Datastore - SQL Warning : The new index is a duplicate of an existing index: SQL130213232237540.
java.sql.SQLWarning: The new index is a duplicate of an existing index: SQL130213232237540.
[stack trace identical to the duplicate-index warning above]
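
Note: the `SQLWarning` traces above (`SORT_COLS_N49`, `COLUMNS_V2_N49`, `BUCKETING_COLS_N49`, `PARTITION_KEYS_N49`) are Derby reporting that DataNucleus's auto-schema step tried to create an index duplicating one that already backs a foreign-key constraint. They are warnings, not failures — schema creation completes and the test proceeds. A common way to quiet them on subsequent runs (an assumption about this metastore setup, not something the log itself confirms) is to disable automatic schema creation in `hive-site.xml` once the embedded Derby schema exists:

```xml
<!-- Hypothetical hive-site.xml fragment: after the Derby metastore
     schema has been created once, stop DataNucleus from re-issuing
     CREATE INDEX statements on every metastore connection. -->
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
```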
23:22:37,636 INFO pool-3-thread-1 ql.Driver - Semantic Analysis Completed
23:22:37,650 INFO pool-3-thread-1 ql.Driver - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
23:22:37,650 INFO pool-3-thread-1 ql.Driver - </PERFLOG method=compile start=1360790556557 end=1360790557650 duration=1093>
23:22:37,650 INFO pool-3-thread-1 ql.Driver - <PERFLOG method=Driver.execute>
23:22:37,650 INFO pool-3-thread-1 ql.Driver - Starting command: drop table if exists testHiveDriverTable
23:22:37,670 INFO pool-3-thread-1 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790556557 end=1360790557670 duration=1113>
23:22:37,694 INFO pool-3-thread-1 metastore.HiveMetaStore - 0: get_table : db=default tbl=testHiveDriverTable
23:22:37,694 INFO pool-3-thread-1 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_table : db=default tbl=testHiveDriverTable
23:22:37,698 ERROR pool-3-thread-1 metadata.Hive - NoSuchObjectException(message:default.testHiveDriverTable table not found)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1212)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:749)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:889)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:831)
at org.apache.hadoop.hive.ql.exec.DDLTask.dropTable(DDLTask.java:3021)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:268)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1317)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1103)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:936)
at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:192)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:630)
at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:618)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:32)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
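
Note: the `NoSuchObjectException` logged at ERROR level above is expected here. The test issues `drop table if exists testHiveDriverTable`; the metastore lookup fails because the table does not exist yet, and Hive logs that lookup failure even though the `IF EXISTS` guard means the statement itself still succeeds (hence the `OK` that follows). The statements visible in this part of the log amount to:

```sql
-- The ERROR above comes from the metastore lookup inside the first
-- DROP, not from the statement failing.
DROP TABLE IF EXISTS testHiveDriverTable;
CREATE TABLE testHiveDriverTable (key INT, value STRING);
SHOW TABLES;
```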
23:22:37,699 INFO pool-3-thread-1 metastore.HiveMetaStore - 0: get_table : db=default tbl=testHiveDriverTable
23:22:37,699 INFO pool-3-thread-1 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_table : db=default tbl=testHiveDriverTable
23:22:37,704 INFO pool-3-thread-1 ql.Driver - </PERFLOG method=Driver.execute start=1360790557650 end=1360790557704 duration=54>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
OK
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:37,704 INFO pool-3-thread-1 ql.Driver - OK
23:22:37,704 INFO pool-3-thread-1 ql.Driver - <PERFLOG method=releaseLocks>
23:22:37,704 INFO pool-3-thread-1 ql.Driver - </PERFLOG method=releaseLocks start=1360790557704 end=1360790557704 duration=0>
23:22:37,705 INFO pool-3-thread-1 ql.Driver - </PERFLOG method=Driver.run start=1360790556557 end=1360790557704 duration=1147>
23:22:37,705 INFO pool-3-thread-1 service.HiveServer - Returning schema: Schema(fieldSchemas:null, properties:null)
23:22:37,715 INFO pool-3-thread-1 ql.Driver - <PERFLOG method=releaseLocks>
23:22:37,715 INFO pool-3-thread-1 ql.Driver - </PERFLOG method=releaseLocks start=1360790557715 end=1360790557715 duration=0>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132322_340885317.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:37,723 INFO pool-3-thread-2 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132322_340885317.txt
23:22:37,767 INFO pool-3-thread-2 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323223166667414007009677.pipeout
23:22:37,768 INFO pool-3-thread-2 service.HiveServer - Running the query: set hive.fetch.output.serde = org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
23:22:37,768 INFO pool-3-thread-2 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323223166667414007009677.pipeout
23:22:37,770 INFO pool-3-thread-2 service.HiveServer - Running the query: create table testHiveDriverTable (key int, value string)
23:22:37,770 INFO pool-3-thread-2 ql.Driver - <PERFLOG method=Driver.run>
23:22:37,770 INFO pool-3-thread-2 ql.Driver - <PERFLOG method=TimeToSubmit>
23:22:37,771 INFO pool-3-thread-2 ql.Driver - <PERFLOG method=compile>
23:22:37,771 INFO pool-3-thread-2 parse.ParseDriver - Parsing command: create table testHiveDriverTable (key int, value string)
23:22:37,776 INFO pool-3-thread-2 parse.ParseDriver - Parse Completed
23:22:37,799 INFO pool-3-thread-2 parse.SemanticAnalyzer - Starting Semantic Analysis
23:22:37,801 INFO pool-3-thread-2 parse.SemanticAnalyzer - Creating table testHiveDriverTable position=13
23:22:37,803 INFO pool-3-thread-2 ql.Driver - Semantic Analysis Completed
23:22:37,803 INFO pool-3-thread-2 ql.Driver - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
23:22:37,803 INFO pool-3-thread-2 ql.Driver - </PERFLOG method=compile start=1360790557771 end=1360790557803 duration=32>
23:22:37,803 INFO pool-3-thread-2 ql.Driver - <PERFLOG method=Driver.execute>
23:22:37,803 INFO pool-3-thread-2 ql.Driver - Starting command: create table testHiveDriverTable (key int, value string)
23:22:37,805 INFO pool-3-thread-2 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790557770 end=1360790557805 duration=35>
23:22:37,806 INFO pool-3-thread-2 exec.DDLTask - Default to LazySimpleSerDe for table testHiveDriverTable
23:22:37,810 INFO pool-3-thread-2 hive.log - DDL: struct testHiveDriverTable { i32 key, string value}
23:22:37,826 INFO pool-3-thread-2 metastore.HiveMetaStore - 1: create_table: db=default tbl=testHiveDriverTable
23:22:37,827 INFO pool-3-thread-2 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=create_table: db=default tbl=testHiveDriverTable
23:22:37,827 INFO pool-3-thread-2 metastore.HiveMetaStore - 1: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
23:22:37,828 INFO pool-3-thread-2 metastore.ObjectStore - ObjectStore, initialize called
23:22:37,828 INFO pool-3-thread-2 metastore.ObjectStore - Initialized ObjectStore
23:22:37,954 INFO pool-3-thread-2 DataNucleus.MetaData - Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MColumnDescriptor
23:22:37,955 INFO pool-3-thread-2 DataNucleus.MetaData - Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MSerDeInfo
23:22:37,955 INFO pool-3-thread-2 DataNucleus.MetaData - Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MStorageDescriptor
23:22:37,955 INFO pool-3-thread-2 DataNucleus.MetaData - Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MTable
23:22:37,955 INFO pool-3-thread-2 DataNucleus.MetaData - Listener found initialisation for persistable class org.apache.hadoop.hive.metastore.model.MFieldSchema
23:22:38,144 INFO pool-3-thread-2 ql.Driver - </PERFLOG method=Driver.execute start=1360790557803 end=1360790558144 duration=341>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
OK
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,144 INFO pool-3-thread-2 ql.Driver - OK
23:22:38,144 INFO pool-3-thread-2 ql.Driver - <PERFLOG method=releaseLocks>
23:22:38,144 INFO pool-3-thread-2 ql.Driver - </PERFLOG method=releaseLocks start=1360790558144 end=1360790558144 duration=0>
23:22:38,144 INFO pool-3-thread-2 ql.Driver - </PERFLOG method=Driver.run start=1360790557770 end=1360790558144 duration=374>
23:22:38,145 INFO pool-3-thread-2 service.HiveServer - Returning schema: Schema(fieldSchemas:null, properties:null)
23:22:38,145 INFO pool-3-thread-2 ql.Driver - <PERFLOG method=releaseLocks>
23:22:38,145 INFO pool-3-thread-2 ql.Driver - </PERFLOG method=releaseLocks start=1360790558145 end=1360790558145 duration=0>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132322_520661410.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,151 INFO pool-3-thread-3 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132322_520661410.txt
23:22:38,189 INFO pool-3-thread-3 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323224003860105625111360.pipeout
23:22:38,190 INFO pool-3-thread-3 service.HiveServer - Running the query: set hive.fetch.output.serde = org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
23:22:38,190 INFO pool-3-thread-3 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323224003860105625111360.pipeout
23:22:38,191 INFO pool-3-thread-3 service.HiveServer - Running the query: show tables
23:22:38,191 INFO pool-3-thread-3 ql.Driver - <PERFLOG method=Driver.run>
23:22:38,191 INFO pool-3-thread-3 ql.Driver - <PERFLOG method=TimeToSubmit>
23:22:38,192 INFO pool-3-thread-3 ql.Driver - <PERFLOG method=compile>
23:22:38,192 INFO pool-3-thread-3 parse.ParseDriver - Parsing command: show tables
23:22:38,193 INFO pool-3-thread-3 parse.ParseDriver - Parse Completed
23:22:38,195 INFO pool-3-thread-3 ql.Driver - Semantic Analysis Completed
23:22:38,200 INFO pool-3-thread-3 ql.Driver - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
23:22:38,201 INFO pool-3-thread-3 ql.Driver - </PERFLOG method=compile start=1360790558192 end=1360790558201 duration=9>
23:22:38,201 INFO pool-3-thread-3 ql.Driver - <PERFLOG method=Driver.execute>
23:22:38,201 INFO pool-3-thread-3 ql.Driver - Starting command: show tables
23:22:38,201 INFO pool-3-thread-3 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790558191 end=1360790558201 duration=10>
23:22:38,203 INFO pool-3-thread-3 metastore.HiveMetaStore - 2: get_database: default
23:22:38,203 INFO pool-3-thread-3 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_database: default
23:22:38,203 INFO pool-3-thread-3 metastore.HiveMetaStore - 2: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
23:22:38,204 INFO pool-3-thread-3 metastore.ObjectStore - ObjectStore, initialize called
23:22:38,204 INFO pool-3-thread-3 metastore.ObjectStore - Initialized ObjectStore
23:22:38,207 INFO pool-3-thread-3 metastore.HiveMetaStore - 2: get_tables: db=default pat=.*
23:22:38,208 INFO pool-3-thread-3 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
23:22:38,352 INFO pool-3-thread-3 ql.Driver - </PERFLOG method=Driver.execute start=1360790558201 end=1360790558352 duration=151>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
OK
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,353 INFO pool-3-thread-3 ql.Driver - OK
23:22:38,353 INFO pool-3-thread-3 ql.Driver - <PERFLOG method=releaseLocks>
23:22:38,353 INFO pool-3-thread-3 ql.Driver - </PERFLOG method=releaseLocks start=1360790558353 end=1360790558353 duration=0>
23:22:38,353 INFO pool-3-thread-3 ql.Driver - </PERFLOG method=Driver.run start=1360790558191 end=1360790558353 duration=162>
23:22:38,353 INFO pool-3-thread-3 service.HiveServer - Returning schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
23:22:38,357 INFO pool-3-thread-3 ql.Driver - <PERFLOG method=releaseLocks>
23:22:38,357 INFO pool-3-thread-3 ql.Driver - </PERFLOG method=releaseLocks start=1360790558357 end=1360790558357 duration=0>
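The closing `</PERFLOG>` entries above carry `start` and `end` as epoch milliseconds, with `duration` being their difference. A minimal sketch (Python; the regex and function names are my own, not part of Hive) that recovers these fields from such a line and checks that invariant:

```python
import re

# Matches closing PERFLOG entries such as:
#   </PERFLOG method=compile start=1360790558192 end=1360790558201 duration=9>
PERFLOG_CLOSE = re.compile(
    r"</PERFLOG method=(?P<method>[\w.]+)"
    r" start=(?P<start>\d+) end=(?P<end>\d+) duration=(?P<duration>\d+)>"
)

def parse_perflog_close(line):
    """Return {'method', 'start', 'end', 'duration'} for a closing PERFLOG
    entry, or None if the line is not one (e.g. an opening <PERFLOG ...> tag)."""
    m = PERFLOG_CLOSE.search(line)
    if m is None:
        return None
    return {
        "method": m.group("method"),
        "start": int(m.group("start")),
        "end": int(m.group("end")),
        "duration": int(m.group("duration")),
    }

line = ("23:22:38,201 INFO pool-3-thread-3 ql.Driver - "
        "</PERFLOG method=compile start=1360790558192 end=1360790558201 duration=9>")
entry = parse_perflog_close(line)
```

Note that the opening `<PERFLOG method=...>` lines carry no timings, so the parser deliberately ignores them and only pairs up the closing entries.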
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132322_1529609753.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,362 INFO pool-3-thread-4 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132322_1529609753.txt
23:22:38,398 INFO pool-3-thread-4 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323226761737057166769045.pipeout
23:22:38,399 INFO pool-3-thread-4 service.HiveServer - Running the query: set hive.fetch.output.serde = org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
23:22:38,399 INFO pool-3-thread-4 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323226761737057166769045.pipeout
23:22:38,400 INFO pool-3-thread-4 service.HiveServer - Running the query: select count(1) from testHiveDriverTable
23:22:38,400 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=Driver.run>
23:22:38,400 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=TimeToSubmit>
23:22:38,401 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=compile>
23:22:38,401 INFO pool-3-thread-4 parse.ParseDriver - Parsing command: select count(1) from testHiveDriverTable
23:22:38,414 INFO pool-3-thread-4 parse.ParseDriver - Parse Completed
23:22:38,415 INFO pool-3-thread-4 parse.SemanticAnalyzer - Starting Semantic Analysis
23:22:38,416 INFO pool-3-thread-4 parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
23:22:38,416 INFO pool-3-thread-4 parse.SemanticAnalyzer - Get metadata for source tables
23:22:38,416 INFO pool-3-thread-4 metastore.HiveMetaStore - 3: get_table : db=default tbl=testHiveDriverTable
23:22:38,416 INFO pool-3-thread-4 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_table : db=default tbl=testHiveDriverTable
23:22:38,417 INFO pool-3-thread-4 metastore.HiveMetaStore - 3: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
23:22:38,417 INFO pool-3-thread-4 metastore.ObjectStore - ObjectStore, initialize called
23:22:38,418 INFO pool-3-thread-4 metastore.ObjectStore - Initialized ObjectStore
23:22:38,507 INFO pool-3-thread-4 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:22:38,508 INFO pool-3-thread-4 parse.SemanticAnalyzer - Get metadata for subqueries
23:22:38,508 INFO pool-3-thread-4 parse.SemanticAnalyzer - Get metadata for destination tables
23:22:38,597 INFO pool-3-thread-4 parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
23:22:38,600 INFO pool-3-thread-4 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:22:38,719 INFO pool-3-thread-4 ppd.OpProcFactory - Processing for FS(6)
23:22:38,719 INFO pool-3-thread-4 ppd.OpProcFactory - Processing for SEL(5)
23:22:38,719 INFO pool-3-thread-4 ppd.OpProcFactory - Processing for GBY(4)
23:22:38,720 INFO pool-3-thread-4 ppd.OpProcFactory - Processing for RS(3)
23:22:38,720 INFO pool-3-thread-4 ppd.OpProcFactory - Processing for GBY(2)
23:22:38,720 INFO pool-3-thread-4 ppd.OpProcFactory - Processing for SEL(1)
23:22:38,720 INFO pool-3-thread-4 ppd.OpProcFactory - Processing for TS(0)
23:22:38,773 INFO pool-3-thread-4 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:22:38,774 INFO pool-3-thread-4 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:22:38,774 INFO pool-3-thread-4 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:22:38,782 INFO pool-3-thread-4 physical.MetadataOnlyOptimizer - Looking for table scans where optimization is applicable
23:22:38,786 INFO pool-3-thread-4 physical.MetadataOnlyOptimizer - Found 0 metadata only table scans
23:22:38,786 INFO pool-3-thread-4 parse.SemanticAnalyzer - Completed plan generation
23:22:38,786 INFO pool-3-thread-4 ql.Driver - Semantic Analysis Completed
23:22:38,788 INFO pool-3-thread-4 ql.Driver - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], properties:null)
23:22:38,789 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=compile start=1360790558401 end=1360790558789 duration=388>
23:22:38,789 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=Driver.execute>
23:22:38,789 INFO pool-3-thread-4 ql.Driver - Starting command: select count(1) from testHiveDriverTable
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Total MapReduce jobs = 1
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,794 INFO pool-3-thread-4 ql.Driver - Total MapReduce jobs = 1
23:22:38,795 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790558400 end=1360790558795 duration=395>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Launching Job 1 out of 1
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,795 INFO pool-3-thread-4 ql.Driver - Launching Job 1 out of 1
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Number of reduce tasks determined at compile time: 1
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,802 INFO pool-3-thread-4 exec.Task - Number of reduce tasks determined at compile time: 1
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
In order to change the average load for a reducer (in bytes):
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,802 INFO pool-3-thread-4 exec.Task - In order to change the average load for a reducer (in bytes):
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
set hive.exec.reducers.bytes.per.reducer=<number>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,802 INFO pool-3-thread-4 exec.Task - set hive.exec.reducers.bytes.per.reducer=<number>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
In order to limit the maximum number of reducers:
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,802 INFO pool-3-thread-4 exec.Task - In order to limit the maximum number of reducers:
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
set hive.exec.reducers.max=<number>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,802 INFO pool-3-thread-4 exec.Task - set hive.exec.reducers.max=<number>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
In order to set a constant number of reducers:
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,803 INFO pool-3-thread-4 exec.Task - In order to set a constant number of reducers:
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
set mapred.reduce.tasks=<number>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:22:38,803 INFO pool-3-thread-4 exec.Task - set mapred.reduce.tasks=<number>
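The three `set` hints printed above are the knobs behind "Number of reduce tasks determined at compile time": when `mapred.reduce.tasks` is not pinned, Hive estimates roughly one reducer per `hive.exec.reducers.bytes.per.reducer` bytes of input, capped at `hive.exec.reducers.max`. A sketch of that arithmetic (the default values shown are assumptions taken from Hive 0.9-era configuration defaults, not from this log):

```python
import math

def estimate_reducers(input_bytes,
                      bytes_per_reducer=1_000_000_000,  # hive.exec.reducers.bytes.per.reducer
                      max_reducers=999):                # hive.exec.reducers.max
    """Approximate Hive's compile-time reducer estimate:
    one reducer per bytes_per_reducer of input, at least 1,
    capped at max_reducers."""
    if input_bytes <= 0:
        return 1
    return min(max_reducers, max(1, math.ceil(input_bytes / bytes_per_reducer)))
```

For the tiny test table in this run the estimate collapses to a single reducer, consistent with the "number of reducers: 1" reported later in the log.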
23:22:38,899 INFO pool-3-thread-4 exec.ExecDriver - Using org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
23:22:38,904 INFO pool-3-thread-4 exec.ExecDriver - adding libjars: file:///C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-builtins/0.9.0-cdh4.1.3/jar/6a864aef6987ba91effcb7139e9127f6952be29c/hive-builtins-0.9.0-cdh4.1.3.jar
23:22:38,904 INFO pool-3-thread-4 exec.ExecDriver - Processing alias testhivedrivertable
23:22:38,904 INFO pool-3-thread-4 exec.ExecDriver - Adding input file hdfs://xxx:xxx/user/hive/warehouse/testhivedrivertable
23:22:38,904 INFO pool-3-thread-4 exec.Utilities - Content Summary not cached for hdfs://xxx:xxx/user/hive/warehouse/testhivedrivertable
23:22:39,170 INFO pool-3-thread-4 exec.ExecDriver - Changed input file to hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/-mr-10002/1
23:22:40,917 INFO pool-3-thread-4 exec.ExecDriver - Making Temp Directory: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/-ext-10001
23:22:41,544 WARN pool-3-thread-4 mapred.JobClient - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
23:23:03,516 INFO pool-3-thread-4 io.CombineHiveInputFormat - CombineHiveInputSplit creating pool for hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/-mr-10002/1; using filter path hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/-mr-10002/1
23:23:03,694 INFO pool-3-thread-4 mapred.FileInputFormat - Total input paths to process : 1
23:23:03,878 INFO pool-3-thread-4 io.CombineHiveInputFormat - number of splits 1
23:23:05,867 WARN pool-3-thread-4 conf.Configuration - fs.default.name is deprecated. Instead, use fs.defaultFS
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Starting Job = job_201212270850_1366, Tracking URL = http://0.0.0.0:50030/jobdetails.jsp?jobid=job_201212270850_1366
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:07,171 INFO pool-3-thread-4 exec.Task - Starting Job = job_201212270850_1366, Tracking URL = http://0.0.0.0:50030/jobdetails.jsp?jobid=job_201212270850_1366
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Kill Command = \usr\bin\hadoop job -Dmapred.job.tracker=sof-40612-vm03.eng.vmware.com:8021 -kill job_201212270850_1366
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:07,172 INFO pool-3-thread-4 exec.Task - Kill Command = \usr\bin\hadoop job -Dmapred.job.tracker=sof-40612-vm03.eng.vmware.com:8021 -kill job_201212270850_1366
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:15,545 INFO pool-3-thread-4 exec.Task - Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
23:23:15,842 WARN pool-3-thread-4 mapreduce.Counters - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:15,842 Stage-1 map = 0%, reduce = 0%
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:15,850 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:15,842 Stage-1 map = 0%, reduce = 0%
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:22,606 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:22,611 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:22,606 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:23,955 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:23,957 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:23,955 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:25,753 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:25,756 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:25,753 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:27,108 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:27,111 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:27,108 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:28,466 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:28,468 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:28,466 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:29,821 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:29,824 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:29,821 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:31,173 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:31,176 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:31,173 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:32,523 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:32,527 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:32,523 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:33,880 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:33,883 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:33,880 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:35,228 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:35,232 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:35,228 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:36,574 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:36,577 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:36,574 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:37,916 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:37,919 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:37,916 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
2013-02-13 23:23:39,270 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:39,274 INFO pool-3-thread-4 exec.Task - 2013-02-13 23:23:39,270 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
MapReduce Total cumulative CPU time: 4 seconds 500 msec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:39,356 INFO pool-3-thread-4 exec.Task - MapReduce Total cumulative CPU time: 4 seconds 500 msec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Ended Job = job_201212270850_1366
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:39,537 INFO pool-3-thread-4 exec.Task - Ended Job = job_201212270850_1366
23:23:39,976 INFO pool-3-thread-4 exec.FileSinkOperator - Moving tmp dir: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/_tmp.-ext-10001 to: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/_tmp.-ext-10001.intermediate
23:23:40,153 INFO pool-3-thread-4 exec.FileSinkOperator - Moving tmp dir: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/_tmp.-ext-10001.intermediate to: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-22-38_401_7864697230426698665/-ext-10001
23:23:40,409 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=Driver.execute start=1360790558789 end=1360790620409 duration=61620>
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
MapReduce Jobs Launched:
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:40,409 INFO pool-3-thread-4 ql.Driver - MapReduce Jobs Launched:
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Job 0: Map: 1 Reduce: 1 Cumulative CPU: 4.5 sec HDFS Read: 0 HDFS Write: 0 SUCCESS
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:40,412 INFO pool-3-thread-4 ql.Driver - Job 0: Map: 1 Reduce: 1 Cumulative CPU: 4.5 sec HDFS Read: 0 HDFS Write: 0 SUCCESS
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
Total MapReduce CPU Time Spent: 4 seconds 500 msec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:40,413 INFO pool-3-thread-4 ql.Driver - Total MapReduce CPU Time Spent: 4 seconds 500 msec
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_ERROR
OK
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveConnection STANDARD_OUT
23:23:40,413 INFO pool-3-thread-4 ql.Driver - OK
23:23:40,413 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=releaseLocks>
23:23:40,414 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=releaseLocks start=1360790620413 end=1360790620414 duration=1>
23:23:40,414 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=Driver.run start=1360790558400 end=1360790620414 duration=62014>
23:23:40,415 INFO pool-3-thread-4 service.HiveServer - Returning schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], properties:null)
23:23:40,580 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=releaseLocks>
23:23:40,581 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=releaseLocks start=1360790620580 end=1360790620581 duration=1>
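The `exec.Task` progress lines in the run above ("Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec") have a stable shape that is easy to scrape when watching a job from outside. A small sketch (Python; names are my own, and the "Cumulative CPU" part is treated as optional because the earliest progress lines in this log omit it):

```python
import re

# Matches Hive job-progress lines, with an optional cumulative-CPU suffix:
#   Stage-1 map = 0%,  reduce = 0%
#   Stage-1 map = 100%,  reduce = 100%, Cumulative CPU 4.5 sec
PROGRESS = re.compile(
    r"Stage-(?P<stage>\d+) map = (?P<map>\d+)%,\s*"
    r"reduce = (?P<reduce>\d+)%"
    r"(?:,\s*Cumulative CPU (?P<cpu>[\d.]+) sec)?"
)

def parse_progress(line):
    """Extract stage number, map/reduce percentages, and (optionally)
    cumulative CPU seconds from a Hive job-progress log line."""
    m = PROGRESS.search(line)
    if m is None:
        return None
    return {
        "stage": int(m.group("stage")),
        "map_pct": int(m.group("map")),
        "reduce_pct": int(m.group("reduce")),
        "cpu_sec": float(m.group("cpu")) if m.group("cpu") else None,
    }

done = parse_progress(
    "2013-02-13 23:23:32,523 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.5 sec")
started = parse_progress(
    "2013-02-13 23:23:15,842 Stage-1 map = 0%, reduce = 0%")
```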
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132323_523546990.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_OUT
23:23:40,597 INFO pool-3-thread-5 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132323_523546990.txt
23:23:40,639 INFO pool-3-thread-5 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323237529489004247079741.pipeout
23:23:40,640 INFO pool-3-thread-5 service.HiveServer - Running the query: create table if not exists testHiveDriverTable (key int, value string)
23:23:40,640 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=Driver.run>
23:23:40,640 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=TimeToSubmit>
23:23:40,640 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=compile>
23:23:40,640 INFO pool-3-thread-5 parse.ParseDriver - Parsing command: create table if not exists testHiveDriverTable (key int, value string)
23:23:40,641 INFO pool-3-thread-5 parse.ParseDriver - Parse Completed
23:23:40,642 INFO pool-3-thread-5 parse.SemanticAnalyzer - Starting Semantic Analysis
23:23:40,642 INFO pool-3-thread-5 parse.SemanticAnalyzer - Creating table testHiveDriverTable position=27
23:23:40,642 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: get_tables: db=default pat=testHiveDriverTable
23:23:40,643 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_tables: db=default pat=testHiveDriverTable
23:23:40,643 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
23:23:40,644 INFO pool-3-thread-5 metastore.ObjectStore - ObjectStore, initialize called
23:23:40,644 INFO pool-3-thread-5 metastore.ObjectStore - Initialized ObjectStore
23:23:40,647 INFO pool-3-thread-5 ql.Driver - Semantic Analysis Completed
23:23:40,647 INFO pool-3-thread-5 ql.Driver - Returning Hive schema: Schema(fieldSchemas:null, properties:null)
23:23:40,648 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=compile start=1360790620640 end=1360790620648 duration=8>
23:23:40,648 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=Driver.execute>
23:23:40,648 INFO pool-3-thread-5 ql.Driver - Starting command: create table if not exists testHiveDriverTable (key int, value string)
23:23:40,648 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790620640 end=1360790620648 duration=8>
23:23:40,648 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=Driver.execute start=1360790620648 end=1360790620648 duration=0>
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_ERROR
OK
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_OUT
23:23:40,648 INFO pool-3-thread-5 ql.Driver - OK
23:23:40,649 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=releaseLocks>
23:23:40,649 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=releaseLocks start=1360790620649 end=1360790620649 duration=0>
23:23:40,649 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=Driver.run start=1360790620640 end=1360790620649 duration=9>
23:23:40,655 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Shutting down the object store...
23:23:40,656 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Shutting down the object store...
23:23:40,656 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Metastore shutdown complete.
23:23:40,656 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Metastore shutdown complete.
23:23:40,657 ERROR pool-3-thread-5 server.TThreadPoolServer - Error occurred during processing of message.
java.lang.NullPointerException
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:34)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132323_1362039560.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_OUT
23:23:40,659 INFO pool-3-thread-4 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132323_1362039560.txt
23:23:40,696 INFO pool-3-thread-4 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323236684893525456893942.pipeout
23:23:40,697 INFO pool-3-thread-4 service.HiveServer - Running the query: show tables
23:23:40,697 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=Driver.run>
23:23:40,697 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=TimeToSubmit>
23:23:40,697 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=compile>
23:23:40,697 INFO pool-3-thread-4 parse.ParseDriver - Parsing command: show tables
23:23:40,698 INFO pool-3-thread-4 parse.ParseDriver - Parse Completed
23:23:40,699 INFO pool-3-thread-4 ql.Driver - Semantic Analysis Completed
23:23:40,702 INFO pool-3-thread-4 ql.Driver - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from deserializer)], properties:null)
23:23:40,703 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=compile start=1360790620697 end=1360790620703 duration=6>
23:23:40,703 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=Driver.execute>
23:23:40,703 INFO pool-3-thread-4 ql.Driver - Starting command: show tables
23:23:40,704 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790620697 end=1360790620704 duration=7>
23:23:40,706 INFO pool-3-thread-4 metastore.HiveMetaStore - 3: get_database: default
23:23:40,706 INFO pool-3-thread-4 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_database: default
23:23:40,709 INFO pool-3-thread-4 metastore.HiveMetaStore - 3: get_tables: db=default pat=.*
23:23:40,709 INFO pool-3-thread-4 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_tables: db=default pat=.*
23:23:40,828 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=Driver.execute start=1360790620703 end=1360790620828 duration=125>
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_ERROR
OK
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_OUT
23:23:40,828 INFO pool-3-thread-4 ql.Driver - OK
23:23:40,828 INFO pool-3-thread-4 ql.Driver - <PERFLOG method=releaseLocks>
23:23:40,828 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=releaseLocks start=1360790620828 end=1360790620828 duration=0>
23:23:40,828 INFO pool-3-thread-4 ql.Driver - </PERFLOG method=Driver.run start=1360790620697 end=1360790620828 duration=131>
23:23:40,869 INFO pool-3-thread-4 mapred.FileInputFormat - Total input paths to process : 1
23:23:40,879 INFO pool-3-thread-4 metastore.HiveMetaStore - 3: Shutting down the object store...
[testhivedrivertable]
23:23:40,879 INFO pool-3-thread-4 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Shutting down the object store...
23:23:40,880 INFO pool-3-thread-4 metastore.HiveMetaStore - 3: Metastore shutdown complete.
23:23:40,880 INFO pool-3-thread-4 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Metastore shutdown complete.
23:23:40,880 ERROR pool-3-thread-4 server.TThreadPoolServer - Error occurred during processing of message.
java.lang.NullPointerException
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:34)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
    at java.lang.Thread.run(Thread.java:662)
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132323_815116960.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_OUT
23:23:40,883 INFO pool-3-thread-5 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132323_815116960.txt
23:23:40,921 INFO pool-3-thread-5 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323237117205779828450457.pipeout
23:23:40,922 INFO pool-3-thread-5 service.HiveServer - Running the query: select count(1) as cnt from testHiveDriverTable
23:23:40,922 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=Driver.run>
23:23:40,922 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=TimeToSubmit>
23:23:40,922 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=compile>
23:23:40,923 INFO pool-3-thread-5 parse.ParseDriver - Parsing command: select count(1) as cnt from testHiveDriverTable
23:23:40,923 INFO pool-3-thread-5 parse.ParseDriver - Parse Completed
23:23:40,924 INFO pool-3-thread-5 parse.SemanticAnalyzer - Starting Semantic Analysis
23:23:40,924 INFO pool-3-thread-5 parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
23:23:40,924 INFO pool-3-thread-5 parse.SemanticAnalyzer - Get metadata for source tables
23:23:40,924 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: get_table : db=default tbl=testHiveDriverTable
23:23:40,924 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_table : db=default tbl=testHiveDriverTable
23:23:40,941 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:23:40,941 INFO pool-3-thread-5 parse.SemanticAnalyzer - Get metadata for subqueries
23:23:40,941 INFO pool-3-thread-5 parse.SemanticAnalyzer - Get metadata for destination tables
23:23:41,029 INFO pool-3-thread-5 parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
23:23:41,030 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:23:41,038 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for FS(20)
23:23:41,039 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for SEL(19)
23:23:41,039 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for GBY(18)
23:23:41,039 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for RS(17)
23:23:41,039 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for GBY(16)
23:23:41,039 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for SEL(15)
23:23:41,039 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for TS(14)
23:23:41,043 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:23:41,043 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:23:41,043 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:23:41,044 INFO pool-3-thread-5 physical.MetadataOnlyOptimizer - Looking for table scans where optimization is applicable
23:23:41,045 INFO pool-3-thread-5 physical.MetadataOnlyOptimizer - Found 0 metadata only table scans
23:23:41,045 INFO pool-3-thread-5 parse.SemanticAnalyzer - Completed plan generation
23:23:41,045 INFO pool-3-thread-5 ql.Driver - Semantic Analysis Completed
23:23:41,047 INFO pool-3-thread-5 ql.Driver - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:cnt, type:bigint, comment:null)], properties:null)
23:23:41,049 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=compile start=1360790620922 end=1360790621048 duration=126>
23:23:41,049 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=Driver.execute>
23:23:41,049 INFO pool-3-thread-5 ql.Driver - Starting command: select count(1) as cnt from testHiveDriverTable
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_ERROR
Total MapReduce jobs = 1
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForInt STANDARD_OUT
23:23:41,050 INFO pool-3-thread-5 ql.Driver - Total MapReduce jobs = 1
23:23:41,050 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790620922 end=1360790621050 duration=128>
23:23:41,050 INFO pool-3-thread-5 ql.Driver - Launching Job 1 out of 1
23:23:41,058 INFO pool-3-thread-5 exec.Task - Number of reduce tasks determined at compile time: 1
23:23:41,059 INFO pool-3-thread-5 exec.Task - In order to change the average load for a reducer (in bytes):
23:23:41,059 INFO pool-3-thread-5 exec.Task - set hive.exec.reducers.bytes.per.reducer=<number>
23:23:41,059 INFO pool-3-thread-5 exec.Task - In order to limit the maximum number of reducers:
23:23:41,059 INFO pool-3-thread-5 exec.Task - set hive.exec.reducers.max=<number>
23:23:41,059 INFO pool-3-thread-5 exec.Task - In order to set a constant number of reducers:
23:23:41,059 INFO pool-3-thread-5 exec.Task - set mapred.reduce.tasks=<number>
23:23:41,144 INFO pool-3-thread-5 exec.ExecDriver - Using org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
23:23:41,144 INFO pool-3-thread-5 exec.ExecDriver - adding libjars: file:///C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-builtins/0.9.0-cdh4.1.3/jar/6a864aef6987ba91effcb7139e9127f6952be29c/hive-builtins-0.9.0-cdh4.1.3.jar
23:23:41,145 INFO pool-3-thread-5 exec.ExecDriver - Processing alias testhivedrivertable
23:23:41,145 INFO pool-3-thread-5 exec.ExecDriver - Adding input file hdfs://xxx:xxx/user/hive/warehouse/testhivedrivertable
23:23:41,145 INFO pool-3-thread-5 exec.Utilities - Content Summary not cached for hdfs://xxx:xxx/user/hive/warehouse/testhivedrivertable
23:23:41,401 INFO pool-3-thread-5 exec.ExecDriver - Changed input file to hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/-mr-10002/1
23:23:42,745 INFO pool-3-thread-5 exec.ExecDriver - Making Temp Directory: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/-ext-10001
23:23:43,170 WARN pool-3-thread-5 mapred.JobClient - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
23:23:59,409 INFO pool-3-thread-5 io.CombineHiveInputFormat - CombineHiveInputSplit creating pool for hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/-mr-10002/1; using filter path hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/-mr-10002/1
23:23:59,863 INFO pool-3-thread-5 mapred.FileInputFormat - Total input paths to process : 1
23:24:00,033 INFO pool-3-thread-5 io.CombineHiveInputFormat - number of splits 1
23:24:01,960 WARN pool-3-thread-5 conf.Configuration - fs.default.name is deprecated. Instead, use fs.defaultFS
23:24:03,940 INFO pool-3-thread-5 exec.Task - Starting Job = job_201212270850_1367, Tracking URL = http://0.0.0.0:50030/jobdetails.jsp?jobid=job_201212270850_1367
23:24:03,941 INFO pool-3-thread-5 exec.Task - Kill Command = \usr\bin\hadoop job -Dmapred.job.tracker=sof-40612-vm03.eng.vmware.com:8021 -kill job_201212270850_1367
23:24:11,152 INFO pool-3-thread-5 exec.Task - Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
23:24:11,414 WARN pool-3-thread-5 mapreduce.Counters - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
23:24:11,415 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:11,414 Stage-1 map = 0%, reduce = 0%
23:24:19,494 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:19,491 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:20,845 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:20,841 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:22,196 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:22,194 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:23,542 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:23,540 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:24,900 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:24,897 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:26,253 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:26,250 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:27,604 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:27,601 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:28,957 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:28,955 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.35 sec
23:24:30,786 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:30,783 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.58 sec
23:24:32,141 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:32,138 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.58 sec
23:24:33,492 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:33,489 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.58 sec
23:24:34,844 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:34,841 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.58 sec
23:24:36,202 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:24:36,199 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.58 sec
23:24:36,286 INFO pool-3-thread-5 exec.Task - MapReduce Total cumulative CPU time: 4 seconds 580 msec
23:24:36,464 INFO pool-3-thread-5 exec.Task - Ended Job = job_201212270850_1367
23:24:36,891 INFO pool-3-thread-5 exec.FileSinkOperator - Moving tmp dir: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/_tmp.-ext-10001 to: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/_tmp.-ext-10001.intermediate
23:24:37,063 INFO pool-3-thread-5 exec.FileSinkOperator - Moving tmp dir: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/_tmp.-ext-10001.intermediate to: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-23-40_922_2291668700139825900/-ext-10001
23:24:37,319 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=Driver.execute start=1360790621049 end=1360790677319 duration=56270>
23:24:37,319 INFO pool-3-thread-5 ql.Driver - MapReduce Jobs Launched:
23:24:37,319 INFO pool-3-thread-5 ql.Driver - Job 0: Map: 1 Reduce: 1 Cumulative CPU: 4.58 sec HDFS Read: 0 HDFS Write: 0 SUCCESS
23:24:37,320 INFO pool-3-thread-5 ql.Driver - Total MapReduce CPU Time Spent: 4 seconds 580 msec
23:24:37,321 INFO pool-3-thread-5 ql.Driver - OK
23:24:37,321 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=releaseLocks>
23:24:37,321 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=releaseLocks start=1360790677321 end=1360790677321 duration=0>
23:24:37,321 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=Driver.run start=1360790620922 end=1360790677321 duration=56399>
23:24:38,012 INFO pool-3-thread-5 mapred.FileInputFormat - Total input paths to process : 1
23:24:38,387 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Shutting down the object store...
23:24:38,388 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Shutting down the object store...
23:24:38,388 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Metastore shutdown complete.
23:24:38,388 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Metastore shutdown complete.
23:24:38,388 ERROR pool-3-thread-5 server.TThreadPoolServer - Error occurred during processing of message.
java.lang.NullPointerException
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:34)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForLong STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132324_1199892416.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForLong STANDARD_OUT
23:24:38,397 INFO pool-3-thread-5 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132324_1199892416.txt
23:24:38,442 INFO pool-3-thread-5 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323247115435583898915024.pipeout
23:24:38,443 INFO pool-3-thread-5 service.HiveServer - Running the query: select count(1) as cnt from testHiveDriverTable
23:24:38,443 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=Driver.run>
23:24:38,443 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=TimeToSubmit>
23:24:38,443 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=compile>
23:24:38,443 INFO pool-3-thread-5 parse.ParseDriver - Parsing command: select count(1) as cnt from testHiveDriverTable
23:24:38,445 INFO pool-3-thread-5 parse.ParseDriver - Parse Completed
23:24:38,446 INFO pool-3-thread-5 parse.SemanticAnalyzer - Starting Semantic Analysis
23:24:38,446 INFO pool-3-thread-5 parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
23:24:38,446 INFO pool-3-thread-5 parse.SemanticAnalyzer - Get metadata for source tables
23:24:38,446 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: get_table : db=default tbl=testHiveDriverTable
23:24:38,446 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_table : db=default tbl=testHiveDriverTable
23:24:38,461 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:24:38,461 INFO pool-3-thread-5 parse.SemanticAnalyzer - Get metadata for subqueries
23:24:38,461 INFO pool-3-thread-5 parse.SemanticAnalyzer - Get metadata for destination tables
23:24:38,545 INFO pool-3-thread-5 parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
23:24:38,545 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:24:38,552 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for FS(34)
23:24:38,552 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for SEL(33)
23:24:38,552 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for GBY(32)
23:24:38,552 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for RS(31)
23:24:38,553 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for GBY(30)
23:24:38,553 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for SEL(29)
23:24:38,553 INFO pool-3-thread-5 ppd.OpProcFactory - Processing for TS(28)
23:24:38,557 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:24:38,557 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:24:38,557 INFO pool-3-thread-5 hive.log - DDL: struct testhivedrivertable { i32 key, string value}
23:24:38,558 INFO pool-3-thread-5 physical.MetadataOnlyOptimizer - Looking for table scans where optimization is applicable
23:24:38,559 INFO pool-3-thread-5 physical.MetadataOnlyOptimizer - Found 0 metadata only table scans
23:24:38,559 INFO pool-3-thread-5 parse.SemanticAnalyzer - Completed plan generation
23:24:38,559 INFO pool-3-thread-5 ql.Driver - Semantic Analysis Completed
23:24:38,561 INFO pool-3-thread-5 ql.Driver - Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:cnt, type:bigint, comment:null)], properties:null)
23:24:38,562 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=compile start=1360790678443 end=1360790678562 duration=119>
23:24:38,562 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=Driver.execute>
23:24:38,562 INFO pool-3-thread-5 ql.Driver - Starting command: select count(1) as cnt from testHiveDriverTable
23:24:38,563 INFO pool-3-thread-5 ql.Driver - Total MapReduce jobs = 1
23:24:38,563 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=TimeToSubmit start=1360790678443 end=1360790678563 duration=120>
23:24:38,564 INFO pool-3-thread-5 ql.Driver - Launching Job 1 out of 1
23:24:38,566 INFO pool-3-thread-5 exec.Task - Number of reduce tasks determined at compile time: 1
23:24:38,566 INFO pool-3-thread-5 exec.Task - In order to change the average load for a reducer (in bytes):
23:24:38,566 INFO pool-3-thread-5 exec.Task - set hive.exec.reducers.bytes.per.reducer=<number>
23:24:38,567 INFO pool-3-thread-5 exec.Task - In order to limit the maximum number of reducers:
23:24:38,567 INFO pool-3-thread-5 exec.Task - set hive.exec.reducers.max=<number>
23:24:38,567 INFO pool-3-thread-5 exec.Task - In order to set a constant number of reducers:
23:24:38,567 INFO pool-3-thread-5 exec.Task - set mapred.reduce.tasks=<number>
23:24:38,652 INFO pool-3-thread-5 exec.ExecDriver - Using org.apache.hadoop.hive.ql.io.CombineHiveInputFormat
23:24:38,653 INFO pool-3-thread-5 exec.ExecDriver - adding libjars: file:///C:/Users/costin/.gradle/caches/artifacts-15/filestore/org.apache.hive/hive-builtins/0.9.0-cdh4.1.3/jar/6a864aef6987ba91effcb7139e9127f6952be29c/hive-builtins-0.9.0-cdh4.1.3.jar
23:24:38,653 INFO pool-3-thread-5 exec.ExecDriver - Processing alias testhivedrivertable
23:24:38,653 INFO pool-3-thread-5 exec.ExecDriver - Adding input file hdfs://xxx:xxx/user/hive/warehouse/testhivedrivertable
23:24:38,653 INFO pool-3-thread-5 exec.Utilities - Content Summary not cached for hdfs://xxx:xxx/user/hive/warehouse/testhivedrivertable
23:24:39,225 INFO pool-3-thread-5 exec.ExecDriver - Changed input file to hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/-mr-10002/1
23:24:40,513 INFO pool-3-thread-5 exec.ExecDriver - Making Temp Directory: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/-ext-10001
23:24:40,934 WARN pool-3-thread-5 mapred.JobClient - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
23:25:00,295 INFO pool-3-thread-5 io.CombineHiveInputFormat - CombineHiveInputSplit creating pool for hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/-mr-10002/1; using filter path hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/-mr-10002/1
23:25:00,459 INFO pool-3-thread-5 mapred.FileInputFormat - Total input paths to process : 1
23:25:00,631 INFO pool-3-thread-5 io.CombineHiveInputFormat - number of splits 1
23:25:02,579 WARN pool-3-thread-5 conf.Configuration - fs.default.name is deprecated. Instead, use fs.defaultFS
23:25:03,835 INFO pool-3-thread-5 exec.Task - Starting Job = job_201212270850_1368, Tracking URL = http://0.0.0.0:50030/jobdetails.jsp?jobid=job_201212270850_1368
23:25:03,836 INFO pool-3-thread-5 exec.Task - Kill Command = \usr\bin\hadoop job -Dmapred.job.tracker=sof-40612-vm03.eng.vmware.com:8021 -kill job_201212270850_1368
23:25:11,007 INFO pool-3-thread-5 exec.Task - Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
23:25:11,262 WARN pool-3-thread-5 mapreduce.Counters - Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
23:25:11,263 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:11,262 Stage-1 map = 0%, reduce = 0%
23:25:19,291 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:19,288 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
23:25:20,631 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:20,628 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
23:25:21,971 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:21,969 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
23:25:23,303 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:23,301 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
23:25:24,640 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:24,637 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
23:25:25,974 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:25,972 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
23:25:27,315 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:27,312 Stage-1 map = 100%, reduce = 0%, Cumulative CPU 1.25 sec
23:25:29,104 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:29,101 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.45 sec
23:25:30,441 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:30,438 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.45 sec
23:25:31,786 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:31,784 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.45 sec
23:25:33,122 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:33,119 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.45 sec
23:25:34,461 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:34,458 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.45 sec
23:25:35,806 INFO pool-3-thread-5 exec.Task - 2013-02-13 23:25:35,802 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 4.45 sec
23:25:35,888 INFO pool-3-thread-5 exec.Task - MapReduce Total cumulative CPU time: 4 seconds 450 msec
23:25:36,056 INFO pool-3-thread-5 exec.Task - Ended Job = job_201212270850_1368
23:25:36,467 INFO pool-3-thread-5 exec.FileSinkOperator - Moving tmp dir: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/_tmp.-ext-10001 to: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/_tmp.-ext-10001.intermediate
23:25:36,640 INFO pool-3-thread-5 exec.FileSinkOperator - Moving tmp dir: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/_tmp.-ext-10001.intermediate to: hdfs://xxx:xxx/tmp/hive-costin/hive_2013-02-13_23-24-38_443_7186141473453631953/-ext-10001
23:25:36,890 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=Driver.execute start=1360790678562 end=1360790736890 duration=58328>
23:25:36,890 INFO pool-3-thread-5 ql.Driver - MapReduce Jobs Launched:
23:25:36,891 INFO pool-3-thread-5 ql.Driver - Job 0: Map: 1 Reduce: 1 Cumulative CPU: 4.45 sec HDFS Read: 0 HDFS Write: 0 SUCCESS
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForLong STANDARD_ERROR
Total MapReduce CPU Time Spent: 4 seconds 450 msec
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForLong STANDARD_OUT
23:25:36,891 INFO pool-3-thread-5 ql.Driver - Total MapReduce CPU Time Spent: 4 seconds 450 msec
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForLong STANDARD_ERROR
OK
org.springframework.data.hadoop.hive.BasicHiveTest > testQueryForLong STANDARD_OUT
23:25:36,891 INFO pool-3-thread-5 ql.Driver - OK
23:25:36,892 INFO pool-3-thread-5 ql.Driver - <PERFLOG method=releaseLocks>
23:25:36,892 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=releaseLocks start=1360790736892 end=1360790736892 duration=0>
23:25:36,892 INFO pool-3-thread-5 ql.Driver - </PERFLOG method=Driver.run start=1360790678443 end=1360790736892 duration=58449>
23:25:37,275 INFO pool-3-thread-5 mapred.FileInputFormat - Total input paths to process : 1
23:25:37,615 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Shutting down the object store...
org.springframework.data.hadoop.hive.BasicHiveTest STANDARD_OUT
23:25:37,616 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Shutting down the object store...
23:25:37,616 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Metastore shutdown complete.
23:25:37,616 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Metastore shutdown complete.
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveTemplate STANDARD_OUT
23:25:37,616 ERROR pool-3-thread-5 server.TThreadPoolServer - Error occurred during processing of message.
java.lang.NullPointerException
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:34)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveTemplate STANDARD_ERROR
Hive history file=/tmp/costin/hive_job_log_costin_201302132325_2022819645.txt
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveTemplate STANDARD_OUT
23:25:37,628 INFO pool-3-thread-5 exec.HiveHistory - Hive history file=/tmp/costin/hive_job_log_costin_201302132325_2022819645.txt
23:25:37,667 INFO pool-3-thread-5 service.HiveServer - Putting temp output to file \tmp\costin\costin_2013021323259187765026367644572.pipeout
23:25:37,668 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: get_all_databases
23:25:37,668 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=get_all_databases
23:25:37,668 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
23:25:37,669 INFO pool-3-thread-5 metastore.ObjectStore - ObjectStore, initialize called
23:25:37,669 INFO pool-3-thread-5 metastore.ObjectStore - Initialized ObjectStore
[default]
23:25:37,676 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Shutting down the object store...
org.springframework.data.hadoop.hive.BasicHiveTest STANDARD_OUT
23:25:37,676 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Shutting down the object store...
23:25:37,677 INFO pool-3-thread-5 metastore.HiveMetaStore - 4: Metastore shutdown complete.
org.springframework.data.hadoop.hive.BasicHiveTest > testHiveServer STANDARD_OUT
23:25:37,677 INFO pool-3-thread-5 HiveMetaStore.audit - ugi=costin ip=unknown-ip-addr cmd=Metastore shutdown complete.
23:25:37,677 ERROR pool-3-thread-5 server.TThreadPoolServer - Error occurred during processing of message.
java.lang.NullPointerException
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:34)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:34)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:176)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Gradle Worker 1 finished executing tests.
Process 'Gradle Worker 1' finished with exit value 0 (state: SUCCEEDED)
BUILD SUCCESSFUL
Total time: 3 mins 15.389 secs