@sp3c73r2038
Created July 11, 2012 01:31

Troubleshooting scribe

Dependencies

  • Overall
    • These dependencies are required; RHEL/CentOS as the example:
yum install automake libtool flex bison pkgconfig gcc-c++ libevent-devel

# If you use the system-provided boost (RHEL5's default version is too old)
yum install boost-devel

# To build a language binding, remember to install the matching dev package, although the system-provided python and ruby are really not recommended...
yum install python-devel ruby-devel php-devel

# Install java too, if you want hadoop/hdfs support
yum install java-1.6.0-openjdk-devel
  • libevent
    • Make sure this is installed, otherwise thrift and fb303 will fail to build complaining that event.h cannot be found; see the check below
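    • A quick sanity check, assuming the stock RHEL/CentOS package layout:
rpm -q libevent-devel && ls /usr/include/event.h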
  • boost
    • thrift, fb303 and scribe all depend on the boost C++ libraries; version 1.36 or later is required
    • Starting with 1.45 (I happen to use 1.49), boost.filesystem moved from v2 to v3; note that scribe only supports v2, so when building scribe you need to set
CPPFLAGS="-DBOOST_FILESYSTEM_VERSION=2"
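    • To confirm which boost the build will actually pick up (the header path is an assumption; point it at your own prefix):
grep BOOST_LIB_VERSION /usr/include/boost/version.hpp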
  • thrift
    • The latest release is 0.8, but the latest scribe (2.2) only supports up to 0.4, so...
    • 0.4's configure flags are bizarre: do not pass the --with-gen-xx flags, or configure will complain that it cannot find the interpreter/compiler; to drop a language binding, use --without-xx just as with 0.8 (sketch below)
    • Avoid parallel builds, to prevent inexplicable errors:
make -j 1
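    • A minimal 0.4 build sketch; the exact --without-xx flag names are assumptions, so check ./configure --help for your tarball:
./configure --without-ruby --without-php --without-erlang --without-csharp
make -j 1
make install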
  • fb303
    • The build fails with the following error
apache::thrift::protocol::TBinaryProtocolT<Transport_>::writeI32(int32_t)’:
/home/evans/.local/include/thrift/protocol/TBinaryProtocol.tcc:154: error: there
are no arguments to ‘htonl’ that depend on a template parameter, so a
declaration of ‘htonl’ must be available
make[3]: *** [FacebookBase.o] Error 1
make[3]: Leaving directory `/usr/local/src/thrift-0.8.0/contrib/fb303/cpp'
make[2]: *** [all] Error 2
make[2]: Leaving directory `/usr/local/src/thrift-0.8.0/contrib/fb303/cpp'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/usr/local/src/thrift-0.8.0/contrib/fb303'
make: *** [all] Error 2
  • This is because <netinet/in.h> is not being pulled in; set the flags below (a full rebuild sketch follows)
CPPFLAGS="-DHAVE_INTTYPES_H -DHAVE_NETINET_IN_H"
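    • A sketch of the full fb303 rebuild with those macros; the source path mirrors the log above, so adjust the paths and thrift prefix to your own tree:
cd /usr/local/src/thrift-0.8.0/contrib/fb303
./bootstrap.sh
./configure CPPFLAGS="-DHAVE_INTTYPES_H -DHAVE_NETINET_IN_H" --with-thriftpath=$HOME/.local
make -j 1 && make install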

The main event - scribe itself

The damned HDFS support

If you do not plan to enable HDFS support, you can skip the issues below

OK, this is the weirdest part; first, let's look at hadoop's version history

How is that for weird: 0.20.1 -> 0.20.205 -> 1.0 and 0.21 -> 0.23 -> 2.0 are two entirely separate branches, with some mutually exclusive features

Weirder still, the two branches do not even agree on the definitions HdfsFile.cpp relies on: the type of hdfsConnectNewInstance and the deleteHdfs function differ. Not! The! Same!

In 0.21 and later, hdfsConnectNewInstance is not defined at all and deleteHdfs takes 3 parameters; scribe calls hdfsConnectNewInstance, and its deleteHdfs calls pass only 2 parameters

Some googling turned up the details: a patch is needed... which I can no longer find. What does work is Cloudera's CDH3 release, which ships a hadoop already patched for both problems, so go and bravely download it, e.g. as sketched below
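
A hypothetical fetch of the CDH3 tarball referenced by the flags in the next section; the archive URL pattern is an assumption, so check Cloudera's archive for the exact link:

wget http://archive.cloudera.com/cdh/3/hadoop-0.20.2-cdh3u4.tar.gz
tar xzf hadoop-0.20.2-cdh3u4.tar.gz -C $HOME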

Configure flags

If everything above was built as root and installed under /usr/local, you may not need to specify many paths; if, like me, you were perverse enough to install somewhere else, use the flags below

./configure \
--prefix=/home/evans/.local \
--enable-hdfs \
--with-thriftpath=/home/evans/.local \
--with-fb303path=/home/evans/.local \
--with-hadooppath=/home/evans/hadoop \
--with-boost=/home/evans/.local \
CPPFLAGS="-I$HOME/hadoop-0.20.2-cdh3u4/src/c++/libhdfs -I$HOME/hadoop/src/c++/libhdfs -I/usr/lib/jvm/java-1.6.0-openjdk.x86_64/include/ -I/usr/lib/jvm/java-1.6.0-openjdk.x86_64/include/linux -DHAVE_INTTYPES_H -DHAVE_NETINET_IN_H -DBOOST_FILESYSTEM_VERSION=2 "  \
LDFLAGS="-L$HOME/hadoop-0.20.2-cdh3u4/c++/Linux-amd64-64/lib -L/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/amd64/server"

If you chose to bravely face the HDFS support above, here is what the CPPFLAGS and LDFLAGS mean:

  • Without the following 3 flags, you will get errors such as hdfs.h not found, or -lhdfs not found at link time
    • -I$HOME/hadoop-0.20.2-cdh3u4/src/c++/libhdfs adds the path to hdfs.h
    • -L$HOME/hadoop-0.20.2-cdh3u4/c++/Linux-amd64-64/lib CDH3 ships prebuilt libhdfs.la and .so files here
  • Java-related flags; if -ljvm cannot be found, use the following
    • -I/usr/lib/jvm/java-1.6.0-openjdk.x86_64/include/ -I/usr/lib/jvm/java-1.6.0-openjdk.x86_64/include/linux the JDK header (jni.h) locations
    • -L/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/amd64/server the directory holding libjvm.so
  • -DBOOST_FILESYSTEM_VERSION=2 explained earlier; required if your boost is 1.45 or newer
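
After make, it is worth checking that scribed actually linked against libhdfs and libjvm; the binary path follows the scribe source tree:

ldd src/scribed | grep -E 'libhdfs|libjvm'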

If you think that with the build passing and make install done you can start playing, well...

./scribed: error while loading shared libraries: libjvm.so: cannot open shared object file: No such file or directory

FUUUUUUUUUUUUUUUUUUUUUUUUUUUU

I have not found a good fix for this yet; just symlink the libjvm.so referenced earlier, e.g. as below...
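
One workaround sketch, reusing the OpenJDK path from the LDFLAGS above; linking into /usr/local/lib assumes that directory is on your ld.so search path:

ln -s /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/amd64/server/libjvm.so /usr/local/lib/libjvm.so
ldconfig
# or skip the symlink and extend the runtime search path instead
export LD_LIBRARY_PATH=/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/amd64/server:$LD_LIBRARY_PATH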

Finally, congratulations...

@daydup

daydup commented Mar 27, 2013

hi, I installed scribe successfully following this approach, but hit a problem when writing to hdfs:
Creating new category store from model default"
[Thu Mar 28 03:31:29 2013] "store thread starting"
[Thu Mar 28 03:31:29 2013] "[hdfs] Connecting to HDFS for hdfs://zm-99-10:9000/user/hadoop/mydir/passport/ddu"
[Thu Mar 28 03:31:29 2013] "[hdfs] Before hdfsConnectNewInstance(zm-99-10, 9000)"
./src/scribed: symbol lookup error: ./src/scribed: undefined symbol: hdfsConnectNewInstance

No idea what the cause is......
