yum-config-manager --add-repo http://archive.cloudera.com/cdh5/redhat/6/x86_64/cdh/cloudera-cdh5.repo
yum install impala-server impala-catalog impala-state-store impala-shell
ln -sf /usr/lib/hbase/lib/hbase-client.jar /usr/lib/impala/lib
ln -sf /usr/lib/hbase/lib/hbase-common.jar /usr/lib/impala/lib
ln -sf /usr/lib/hbase/lib/hbase-protocol.jar /usr/lib/impala/lib
echo export JAVA_HOME=/usr/jdk64/jdk1.7.0_45 >> /etc/default/bigtop-utils
for i in server state-store catalog ; do service "impala-$i" start ; done
for i in server state-store catalog ; do service "impala-$i" status ; done
for i in server state-store catalog ; do service "impala-$i" stop ; done
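After starting the services, it helps to confirm that each daemon is actually listening. A quick sketch, assuming the stock default ports (21000 for the impalad frontend used by impala-shell, 24000 for the statestore, 26000 for the catalog service); adjust if your ports were reconfigured:

```shell
# Check that each Impala daemon is listening on its default port.
# 21000 - impalad (impala-shell/Beeswax), 24000 - statestored, 26000 - catalogd
for port in 21000 24000 26000 ; do
  if ss -ltn | grep -q ":$port " ; then
    echo "port $port: listening"
  else
    echo "port $port: NOT listening"
  fi
done
```

A port that is not listening usually means the corresponding daemon crashed at startup; check its log under /var/log/impala.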
IMPORTANT! Impala looks for configuration files in the directories found in $CLASSPATH.
Add the following to /etc/hadoop/conf/core-site.xml:
<property>
  <name>dfs.client.read.shortcircuit</name>
  <value>true</value>
</property>
<property>
  <name>dfs.client.read.shortcircuit.skip.checksum</name>
  <value>false</value>
</property>
<property>
  <name>dfs.datanode.hdfs-blocks-metadata.enabled</name>
  <value>true</value>
</property>
Add the following to /etc/hadoop/conf/hdfs-site.xml:
<property>
  <name>dfs.datanode.hdfs-blocks-metadata.enabled</name>
  <value>true</value>
</property>
<property>
  <name>dfs.block.local-path-access.user</name>
  <value>impala</value>
</property>
<property>
  <name>dfs.client.file-block-storage-locations.timeout.millis</name>
  <value>60000</value>
</property>
Copy the Hadoop and Hive configuration files to the Impala configuration directory:
cp /etc/hadoop/conf/*.xml /etc/impala/conf
cp /etc/hive/conf/hive-site.xml /etc/impala/conf
Check permissions (Impala needs access to /var/lib/hadoop-hdfs for short-circuit reads):
chmod a+rx /var/lib/hadoop-hdfs
Restart Hadoop and Impala.
If something goes wrong, check the logs first:
- /var/log/impala/impala-server.log
- /var/log/impala/impala-state-store.log
- /var/log/impala/impala-catalog.log
- /var/log/impala/impalad.ERROR
- /var/log/impala/catalogd.ERROR
- /var/log/impala/statestored.ERROR
- /var/log/hadoop/hdfs/*
Try running invalidate metadata; in impala-shell.
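invalidate metadata can also be run non-interactively, which is handy when scripting the check. A sketch, assuming an impalad is reachable on the default port 21000 (the hostname below is a placeholder; replace it with your own):

```shell
# Run "invalidate metadata" without an interactive session.
# -i names the impalad to connect to (hostname is a placeholder),
# -q runs a single statement and then exits.
impala-shell -i your-impalad-host:21000 -q "invalidate metadata;"
```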
Hi,
I am getting errors when I try to run the Impala server on an HDP 2.3 + Ambari 2.2.1 cluster.
1. When I start the Impala service, the error log is as follows:
Log file created at: 2016/07/13 13:57:57
Running on machine: Ambari.Agent1
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
E0713 13:57:57.459189 29458 logging.cc:120] stderr will be logged to this file.
E0713 13:57:57.842584 29458 impalad-main.cc:60] NoClassDefFoundError: org/apache/hadoop/hbase/client/Scan
CAUSED BY: ClassNotFoundException: org.apache.hadoop.hbase.client.Scan
loadFileSystems error:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
hdfsBuilderConnect(forceNewInstance=0, nn=default, port=0, kerbTicketCachePath=(NULL), userName=(NULL)) error:
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
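The NoClassDefFoundError messages above mean the Hadoop and HBase client jars are not on Impala's classpath. One possible fix, mirroring the symlink approach from the install steps at the top, is to link the HDP client jars into /usr/lib/impala/lib. The /usr/hdp/current/... paths below follow the usual HDP layout but are an assumption; verify them on your cluster first:

```shell
# Link the Hadoop and HBase client jars into Impala's lib directory
# so the JVM can find the missing classes.
# NOTE: the /usr/hdp/current paths are assumptions - check them first.
for jar in /usr/hdp/current/hadoop-client/hadoop-common.jar \
           /usr/hdp/current/hadoop-hdfs-client/hadoop-hdfs.jar \
           /usr/hdp/current/hbase-client/lib/hbase-client.jar ; do
  [ -e "$jar" ] && ln -sf "$jar" /usr/lib/impala/lib/
done
# Restart the daemons so the new classpath takes effect.
for i in server state-store catalog ; do service "impala-$i" restart ; done
```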
2. When I run the 'impala-shell' command, the error log is as follows:
Error connecting: TTransportException, Could not connect to Ambari.Agent1:21000
Please give me some advice.
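The "Could not connect" error from impala-shell usually just means the server-side startup failed (as the impalad log above shows), so the first step is to confirm whether anything is accepting connections on port 21000 on that host. A quick reachability check from the client side, using bash's /dev/tcp feature (the hostname is taken from the error message):

```shell
# Check whether the impalad frontend port (default 21000) on the
# target host accepts TCP connections at all.
host=Ambari.Agent1   # the host named in the error message
if timeout 3 bash -c "cat < /dev/null > /dev/tcp/$host/21000" 2>/dev/null ; then
  echo "impalad port 21000 on $host is reachable"
else
  echo "cannot reach $host:21000 - the server likely failed to start"
fi
```

If the port is unreachable, fix the impalad startup failure (the classpath problem above) before debugging impala-shell itself.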