Apache Hadoop - add native libraries

If native libraries are not available, every hadoop command (for example hadoop checknative) prints the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
  • Clone hadoop source code

$ git clone https://github.com/apache/hadoop.git
$ cd hadoop
  • Check out the 2.7.1 source branch

$ git checkout branch-2.7.1
  • Install the required build dependencies (on OSX, use brew or any other package manager)

$ brew install cmake
$ brew install zlib
$ brew install protobuf
$ brew install snappy
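
Hadoop 2.7.x is strict about the protobuf compiler version: the build expects protoc 2.5.0, while a current brew protobuf is much newer (see the comments below for ways to get 2.5.0). It is worth checking the tool versions before starting the build:

$ cmake --version
$ protoc --version   # the 2.7.x build expects "libprotoc 2.5.0"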
  • Build the project and the native libraries with Maven

$ mvn package -Pdist,native -DskipTests -Dtar
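
If you only need the native hadoop-common library rather than the full distribution tarball, a narrower build may be enough. This is an untested shortcut (the -pl module path is taken from the source tree layout); note that the copy step below assumes the full -Pdist build output:

$ mvn package -Pnative -DskipTests -pl hadoop-common-project/hadoop-common -am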
  • Copy newly created libraries to the hadoop installation

$ mkdir -p $HADOOP_INSTALL/lib/native/osx
$ cp -r hadoop-dist/target/hadoop-2.7.1/lib/native/* $HADOOP_INSTALL/lib/native/osx
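
To sanity-check the copy, list the directory and inspect the main library's linked dependencies. The libhadoop.dylib name assumes the usual OSX CMake output; adjust it if your build produced a different file name:

$ ls $HADOOP_INSTALL/lib/native/osx
$ otool -L $HADOOP_INSTALL/lib/native/osx/libhadoop.dylib   # lists the dylibs it links against, e.g. libz, libsnappy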
  • Add the shell variable either to ~/.bash_profile or to $HADOOP_INSTALL/etc/hadoop/hadoop-env.sh

export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_INSTALL/lib/native/osx"
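
For example, to append the setting to hadoop-env.sh (the single quotes keep the variables from being expanded by your current shell; this is just one way to do it):

$ echo 'export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_INSTALL/lib/native/osx"' >> $HADOOP_INSTALL/etc/hadoop/hadoop-env.sh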

Check if native libraries are available:

$ hadoop checknative
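
checknative prints one line per library (hadoop, zlib, snappy, lz4, bzip2, openssl) with true/false. Adding -a makes the command fail if any library is missing, which is handy in scripts; given the bzip2 caveat below, expect it to report failure on OSX:

$ hadoop checknative -a   # non-zero exit code if any native library failed to load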
Warning: the steps above do not make the bzip2 native library work on OSX.
@jsathish

Download the protobuf 2.5.0 source from
https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.bz2
and run the commands below to build and install it locally:

$ tar xfvj protobuf-2.5.0.tar.bz2
$ cd protobuf-2.5.0
$ ./configure
$ make
$ make check     # optional
$ make install   # may need sudo

@zhuguangbin

brew install https://raw.githubusercontent.com/Homebrew/homebrew-core/0f0b2fc5e2541712b0bb06f74cc1559b1c884750/Formula/protobuf@2.5.rb

Author: tison
Link: https://www.zhihu.com/question/360006055/answer/928011185
Source: Zhihu
The copyright belongs to the author. For commercial reprints, please contact the author for authorization; for non-commercial reprints, please credit the source.
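
protobuf@2.5 is a versioned, keg-only formula, so brew does not put its protoc on the PATH automatically. One way to expose it (paths assume the default /usr/local Homebrew prefix):

$ brew link --force protobuf@2.5
# or, without linking:
$ export PATH="/usr/local/opt/protobuf@2.5/bin:$PATH"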

@eastcirclek

eastcirclek commented Aug 13, 2020

hadoop-pipes related

error

ld: cannot link directly with dylib/framework, your binary is not an allowed client of /usr/lib/libcrypto.dylib for architecture x86_64

situation

  • macOS Catalina
  • openssl 1.1 already installed by brew

solution

Open hadoop-tools/hadoop-pipes/pom.xml and set the OpenSSL prefix explicitly:

  • <openssl.prefix>/usr/local/opt/openssl</openssl.prefix>

Then remove the stale build directory:

rm -rf hadoop-tools/hadoop-pipes/target
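
With the prefix pinned in the pom and the stale target directory removed, re-running the same build command as in the main instructions should link hadoop-pipes against brew's OpenSSL:

$ mvn package -Pdist,native -DskipTests -Dtar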

note

Exporting OPENSSL_ROOT_DIR is not enough because of the following profile:

    <profile>
      <id>native</id>
      <activation>
        <activeByDefault>false</activeByDefault>
      </activation>
      <properties>
        <openssl.prefix></openssl.prefix>
      </properties>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-maven-plugins</artifactId>
            <executions>
              <execution>
                <id>cmake-compile</id>
                <phase>compile</phase>
                <goals><goal>cmake-compile</goal></goals>
                <configuration>
                  <source>${basedir}/src</source>
                  <vars>
                    <JVM_ARCH_DATA_MODEL>${sun.arch.data.model}</JVM_ARCH_DATA_MODEL>
                    <OPENSSL_ROOT_DIR>${openssl.prefix} </OPENSSL_ROOT_DIR>
                  </vars>
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>

${openssl.prefix} (empty unless set in the pom) overrides any OPENSSL_ROOT_DIR exported in the environment, which is why the prefix has to be set in the pom itself.
