shark + spark + hive installation

Installation docs: https://github.com/amplab/shark/wiki/_pages

Problems encountered with Shark 0.8:

  1. Missing MySQL JDBC driver

hive> show tables;
FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Fix: copy mysql-connector-java-5.1.26.jar into hive/lib.
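
A minimal sketch of that fix, assuming Hive is installed under /opt/soft/hive and the connector jar sits in the current directory (both paths are assumptions):

# Put the MySQL JDBC driver on Hive's classpath so the metastore can load it
# (/opt/soft/hive is an assumption; adjust to the local install).
cp mysql-connector-java-5.1.26.jar /opt/soft/hive/lib/

# Restart the Shark/Hive CLI and re-run:  show tables;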

  2. Hadoop version mismatch: the bundled client jars are 1.x while the cluster runs 0.23.9, so remove the 1.x jars and add the 0.23.9 jars (a sketch follows the jar list below).

13/10/30 09:52:57 ERROR exec.Task: FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Server IPC version 5 cannot communicate with client version 4)
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException Server IPC version 5 cannot communicate with client version 4)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:544)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3313)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:242)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1312)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1104)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:937)
	at shark.SharkCliDriver.processCmd(SharkCliDriver.scala:294)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:341)
	at shark.SharkCliDriver$.main(SharkCliDriver.scala:203)
	at shark.SharkCliDriver.main(SharkCliDriver.scala)
	

Add these jars to the jars directory:


jars/hadoop-auth-0.23.9.jar
jars/hadoop-common-0.23.9.jar
jars/hadoop-hdfs-0.23.9.jar
jars/hadoop-mapreduce-client-common-0.23.9.jar
jars/hadoop-mapreduce-client-core-0.23.9.jar
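
A sketch of the swap, assuming Shark 0.8.0 is unpacked in /opt/soft/shark-0.8.0 and a Hadoop 0.23.9 distribution in /opt/soft/hadoop-0.23.9 (both paths are assumptions):

cd /opt/soft/shark-0.8.0

# Remove the Hadoop 1.x client jars that the build pulled into lib_managed.
find lib_managed -type f -name "hadoop-*-1.0.4.jar" -exec rm -f {} \;

# Copy the matching 0.23.9 jars from the Hadoop distribution into jars/.
for j in hadoop-auth hadoop-common hadoop-hdfs \
         hadoop-mapreduce-client-common hadoop-mapreduce-client-core; do
  find /opt/soft/hadoop-0.23.9 -name "${j}-0.23.9.jar" -exec cp {} jars/ \;
done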

java.lang.ExceptionInInitializerError
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2204)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2176)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:306)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:163)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:290)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:194)
	at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:104)
	at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:136)
	at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:151)
	at org.apache.hadoop.hive.metastore.Warehouse.getDatabasePath(Warehouse.java:162)
	at org.apache.hadoop.hive.metastore.Warehouse.getTablePath(Warehouse.java:169)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:798)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table(HiveMetaStore.java:849)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:402)
	at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:538)
	at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3313)
	at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:242)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1312)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1104)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:937)
	at shark.SharkCliDriver.processCmd(SharkCliDriver.scala:294)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:341)
	at shark.SharkCliDriver$.main(SharkCliDriver.scala:203)
	at shark.SharkCliDriver.main(SharkCliDriver.scala)
Caused by: java.lang.IllegalStateException: Shutdown in progress
	at java.lang.ApplicationShutdownHooks.add(ApplicationShutdownHooks.java:39)
	at java.lang.Runtime.addShutdownHook(Runtime.java:192)
	at org.apache.hadoop.util.ShutdownHookManager.<clinit>(ShutdownHookManager.java:47)
	... 27 more

  3. Surprisingly, there are hidden jars on the classpath; delete them (a note and a quick check follow the listing below).

find lib_managed -type f -name "._*.jar"|xargs -i rm -fr "{}"

/opt/soft/shark-0.8.0/lib_managed/jars/org.apache.hadoop/hadoop-client/._hadoop-client-1.0.4.jar	System Classpath
/opt/soft/shark-0.8.0/lib_managed/jars/org.apache.hadoop/hadoop-client/hadoop-client-1.0.4.jar	System Classpath
/opt/soft/shark-0.8.0/lib_managed/jars/org.apache.hadoop/hadoop-core/._hadoop-core-1.0.4.jar
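
The ._*.jar entries are most likely macOS AppleDouble metadata files created when the archive was copied or unpacked on a Mac; Spark still lists them on the System Classpath, where they can shadow the real Hadoop jars. A quick check that the cleanup worked (same lib_managed directory as above):

# Should print nothing once the hidden copies are gone.
find lib_managed -type f -name "._*.jar"

# Optionally confirm which hadoop-client jars remain on disk.
find lib_managed -type f -name "*hadoop-client*.jar"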
  4. NameNode stuck in safe mode: hadoop dfsadmin -safemode leave (see the sketch after the log below).

Resources are low on NN. Safe mode must be turned off manually.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2088)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2066)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:590)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:394)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1571)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1567)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1262)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1565)

13/10/30 11:46:26 ERROR hive.log: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /opt/soft/warehouse/src2. Name node is in safe mode.
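
The "Resources are low on NN" message means the NameNode put itself into safe mode because disk space on the NameNode host is running low, so free some space there first; forcing it out is only a stopgap. A minimal sketch (assumes the hadoop client on the PATH points at the affected cluster):

# Check the current safe mode status.
hadoop dfsadmin -safemode get

# After freeing disk space on the NameNode host, turn safe mode off manually.
hadoop dfsadmin -safemode leave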

