@vishwakarma
Created April 14, 2017 04:42
(venv) [gopi2667@ip-172-31-60-179 my_project]$ pip install pydoop
Collecting pydoop
Using cached pydoop-1.2.0.tar.gz
Requirement already satisfied: setuptools>=3.3 in ./venv/lib/python2.7/site-packages (from pydoop)
Requirement already satisfied: six>=1.6.0 in ./venv/lib/python2.7/site-packages (from setuptools>=3.3->pydoop)
Requirement already satisfied: appdirs>=1.4.0 in ./venv/lib/python2.7/site-packages (from setuptools>=3.3->pydoop)
Requirement already satisfied: packaging>=16.8 in ./venv/lib/python2.7/site-packages (from setuptools>=3.3->pydoop)
Requirement already satisfied: pyparsing in ./venv/lib/python2.7/site-packages (from packaging>=16.8->setuptools>=3.3->pydoop)
Building wheels for collected packages: pydoop
Running setup.py bdist_wheel for pydoop ... error
Complete output from command /home/gopi2667/my_project/venv/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-fTEdvp/pydoop/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmpZlHHm3pip-wheel- --python-tag cp27:
using setuptools version 34.4.1
running bdist_wheel
running build
hdfs core implementation: native
running build_py
creating build
creating build/lib
creating build/lib/pydoop
copying pydoop/test_support.py -> build/lib/pydoop
copying pydoop/__init__.py -> build/lib/pydoop
copying pydoop/pipes.py -> build/lib/pydoop
copying pydoop/hadoop_utils.py -> build/lib/pydoop
copying pydoop/jc.py -> build/lib/pydoop
copying pydoop/avrolib.py -> build/lib/pydoop
copying pydoop/test_utils.py -> build/lib/pydoop
copying pydoop/hadut.py -> build/lib/pydoop
copying pydoop/config.py -> build/lib/pydoop
copying pydoop/version.py -> build/lib/pydoop
creating build/lib/pydoop/mapreduce
copying pydoop/mapreduce/__init__.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/api.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/connections.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/pipes.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/streams.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/simulator.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/string_utils.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/binary_streams.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/jwritable_utils.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/text_streams.py -> build/lib/pydoop/mapreduce
creating build/lib/pydoop/utils
copying pydoop/utils/__init__.py -> build/lib/pydoop/utils
copying pydoop/utils/jvm.py -> build/lib/pydoop/utils
copying pydoop/utils/serialize.py -> build/lib/pydoop/utils
copying pydoop/utils/conversion_tables.py -> build/lib/pydoop/utils
copying pydoop/utils/misc.py -> build/lib/pydoop/utils
creating build/lib/pydoop/hdfs
copying pydoop/hdfs/path.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/__init__.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/fs.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/common.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/file.py -> build/lib/pydoop/hdfs
creating build/lib/pydoop/app
copying pydoop/app/__init__.py -> build/lib/pydoop/app
copying pydoop/app/script_template.py -> build/lib/pydoop/app
copying pydoop/app/submit.py -> build/lib/pydoop/app
copying pydoop/app/script.py -> build/lib/pydoop/app
copying pydoop/app/main.py -> build/lib/pydoop/app
copying pydoop/app/argparse_types.py -> build/lib/pydoop/app
creating build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/pyjnius_loader.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/__init__.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/factory.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/jpype_loader.py -> build/lib/pydoop/utils/bridge
creating build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/__init__.py -> build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/api.py -> build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/impl.py -> build/lib/pydoop/hdfs/core
creating build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/hadoop.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/__init__.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/common.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/pydoop.properties -> build/lib/pydoop
running build_ext
building 'pydoop.sercore' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/src
creating build/temp.linux-x86_64-2.7/src/serialize
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/protocol_codec.cc -o build/temp.linux-x86_64-2.7/src/serialize/protocol_codec.o -Wno-write-strings -O3
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/SerialUtils.cc -o build/temp.linux-x86_64-2.7/src/serialize/SerialUtils.o -Wno-write-strings -O3
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/StringUtils.cc -o build/temp.linux-x86_64-2.7/src/serialize/StringUtils.o -Wno-write-strings -O3
g++ -pthread -shared -Wl,-z,relro build/temp.linux-x86_64-2.7/src/serialize/protocol_codec.o build/temp.linux-x86_64-2.7/src/serialize/SerialUtils.o build/temp.linux-x86_64-2.7/src/serialize/StringUtils.o -L/usr/lib64 -lpython2.7 -o build/lib/pydoop/sercore.so
building 'pydoop.native_core_hdfs' extension
creating build/temp.linux-x86_64-2.7/src/libhdfsV2
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/common
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix
creating build/temp.linux-x86_64-2.7/src/native_core_hdfs
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.91-0.b14.el7_2.x86_64/include -Inative/jni_include -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.91-0.b14.el7_2.x86_64/lib -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.91-0.b14.el7_2.x86_64/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/exception.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/exception.o -Wno-write-strings
In file included from src/libhdfsV2/exception.c:19:0:
src/libhdfsV2/exception.h:39:17: fatal error: jni.h: No such file or directory
#include <jni.h>
^
compilation terminated.
error: command 'gcc' failed with exit status 1
----------------------------------------
Failed building wheel for pydoop
Running setup.py clean for pydoop
Failed to build pydoop
Installing collected packages: pydoop
Running setup.py install for pydoop ... error
Complete output from command /home/gopi2667/my_project/venv/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-fTEdvp/pydoop/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-pVGS4K-record/install-record.txt --single-version-externally-managed --compile --install-headers /home/gopi2667/my_project/venv/include/site/python2.7/pydoop:
using setuptools version 34.4.1
running install
running build
hdfs core implementation: native
running build_py
creating build
creating build/lib
creating build/lib/pydoop
copying pydoop/test_support.py -> build/lib/pydoop
copying pydoop/__init__.py -> build/lib/pydoop
copying pydoop/pipes.py -> build/lib/pydoop
copying pydoop/hadoop_utils.py -> build/lib/pydoop
copying pydoop/jc.py -> build/lib/pydoop
copying pydoop/avrolib.py -> build/lib/pydoop
copying pydoop/test_utils.py -> build/lib/pydoop
copying pydoop/hadut.py -> build/lib/pydoop
copying pydoop/config.py -> build/lib/pydoop
copying pydoop/version.py -> build/lib/pydoop
creating build/lib/pydoop/mapreduce
copying pydoop/mapreduce/__init__.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/api.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/connections.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/pipes.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/streams.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/simulator.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/string_utils.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/binary_streams.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/jwritable_utils.py -> build/lib/pydoop/mapreduce
copying pydoop/mapreduce/text_streams.py -> build/lib/pydoop/mapreduce
creating build/lib/pydoop/utils
copying pydoop/utils/__init__.py -> build/lib/pydoop/utils
copying pydoop/utils/jvm.py -> build/lib/pydoop/utils
copying pydoop/utils/serialize.py -> build/lib/pydoop/utils
copying pydoop/utils/conversion_tables.py -> build/lib/pydoop/utils
copying pydoop/utils/misc.py -> build/lib/pydoop/utils
creating build/lib/pydoop/hdfs
copying pydoop/hdfs/path.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/__init__.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/fs.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/common.py -> build/lib/pydoop/hdfs
copying pydoop/hdfs/file.py -> build/lib/pydoop/hdfs
creating build/lib/pydoop/app
copying pydoop/app/__init__.py -> build/lib/pydoop/app
copying pydoop/app/script_template.py -> build/lib/pydoop/app
copying pydoop/app/submit.py -> build/lib/pydoop/app
copying pydoop/app/script.py -> build/lib/pydoop/app
copying pydoop/app/main.py -> build/lib/pydoop/app
copying pydoop/app/argparse_types.py -> build/lib/pydoop/app
creating build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/pyjnius_loader.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/__init__.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/factory.py -> build/lib/pydoop/utils/bridge
copying pydoop/utils/bridge/jpype_loader.py -> build/lib/pydoop/utils/bridge
creating build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/__init__.py -> build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/api.py -> build/lib/pydoop/hdfs/core
copying pydoop/hdfs/core/impl.py -> build/lib/pydoop/hdfs/core
creating build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/hadoop.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/__init__.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/hdfs/core/bridged/common.py -> build/lib/pydoop/hdfs/core/bridged
copying pydoop/pydoop.properties -> build/lib/pydoop
running build_ext
building 'pydoop.sercore' extension
creating build/temp.linux-x86_64-2.7
creating build/temp.linux-x86_64-2.7/src
creating build/temp.linux-x86_64-2.7/src/serialize
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/protocol_codec.cc -o build/temp.linux-x86_64-2.7/src/serialize/protocol_codec.o -Wno-write-strings -O3
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/SerialUtils.cc -o build/temp.linux-x86_64-2.7/src/serialize/SerialUtils.o -Wno-write-strings -O3
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -I/usr/include/python2.7 -c src/serialize/StringUtils.cc -o build/temp.linux-x86_64-2.7/src/serialize/StringUtils.o -Wno-write-strings -O3
g++ -pthread -shared -Wl,-z,relro build/temp.linux-x86_64-2.7/src/serialize/protocol_codec.o build/temp.linux-x86_64-2.7/src/serialize/SerialUtils.o build/temp.linux-x86_64-2.7/src/serialize/StringUtils.o -L/usr/lib64 -lpython2.7 -o build/lib/pydoop/sercore.so
building 'pydoop.native_core_hdfs' extension
creating build/temp.linux-x86_64-2.7/src/libhdfsV2
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/common
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os
creating build/temp.linux-x86_64-2.7/src/libhdfsV2/os/posix
creating build/temp.linux-x86_64-2.7/src/native_core_hdfs
gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -DHADOOP_LIBHDFS_V2=1 -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.91-0.b14.el7_2.x86_64/include -Inative/jni_include -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.91-0.b14.el7_2.x86_64/lib -I/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.91-0.b14.el7_2.x86_64/include/linux -Isrc/libhdfsV2 -Isrc/libhdfsV2/os/posix -I/usr/include/python2.7 -c src/libhdfsV2/exception.c -o build/temp.linux-x86_64-2.7/src/libhdfsV2/exception.o -Wno-write-strings
In file included from src/libhdfsV2/exception.c:19:0:
src/libhdfsV2/exception.h:39:17: fatal error: jni.h: No such file or directory
#include <jni.h>
^
compilation terminated.
error: command 'gcc' failed with exit status 1
----------------------------------------
Command "/home/gopi2667/my_project/venv/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-fTEdvp/pydoop/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-pVGS4K-record/install-record.txt --single-version-externally-managed --compile --install-headers /home/gopi2667/my_project/venv/include/site/python2.7/pydoop" failed with error code 1 in /tmp/pip-build-fTEdvp/pydoop/
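The root cause is the `fatal error: jni.h: No such file or directory` line: pydoop's native HDFS extension needs the JNI headers, which ship with a JDK, not a JRE. The include path in the gcc command points at an OpenJDK 1.8 install that apparently lacks `include/jni.h`. A likely fix on this kind of RHEL/CentOS/Amazon Linux host is to install the OpenJDK *devel* package and point `JAVA_HOME` at it before retrying (the exact package name and JVM path below are typical examples and may differ on your system):

```shell
# Install the JDK development package, which provides include/jni.h
# (package name is the usual one on yum-based distros; adjust as needed):
sudo yum install -y java-1.8.0-openjdk-devel

# Point JAVA_HOME at the JDK install, not a JRE (example path):
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export PATH="$JAVA_HOME/bin:$PATH"

# Confirm the header exists before retrying, so the build
# doesn't fail the same way again:
ls "$JAVA_HOME/include/jni.h" && pip install pydoop
```

If `jni.h` is present but in a non-standard location, exporting `JAVA_HOME` so that `$JAVA_HOME/include` contains it is usually enough, since pydoop's setup script derives its `-I` flags from the detected JVM path.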