@carrbrpoa
Created February 18, 2021 20:05
(venv) me@carrbrpoa:~/git-github/datahub/metadata-ingestion$ pip install -e .
Obtaining file:///home/me/git-github/datahub/metadata-ingestion
Collecting avro-python3>=1.8.2 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/3f/84/ef37f882a7d93674d6fe1aa6e99f18cf2f34e9b775952f3d85587c11c92e/avro-python3-1.10.1.tar.gz
Requested avro-python3>=1.8.2 from https://files.pythonhosted.org/packages/3f/84/ef37f882a7d93674d6fe1aa6e99f18cf2f34e9b775952f3d85587c11c92e/avro-python3-1.10.1.tar.gz#sha256=9027abeab63dd9b66bd3c564fa0670c70f78027ecb1978d96c6af7ed415b626b (from datahub==0.0.1), but installing version file-.avro-VERSION.txt
Requirement already satisfied: avro_gen@ https://api.github.com/repos/rbystrit/avro_gen/tarball/master in /home/me/git-github/datahub/venv/lib/python3.6/site-packages (from datahub==0.0.1)
Requirement already satisfied: click>=7.1.1 in /home/me/git-github/datahub/venv/lib/python3.6/site-packages (from datahub==0.0.1)
Collecting confluent_kafka>=1.5.0 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/52/7f/bb959d5be7d59e758557dea6c2e6f7f8ac3c347953f95af160abc6bad0f3/confluent-kafka-1.6.0.tar.gz
Collecting fastavro>=1.3.0 (from datahub==0.0.1)
Collecting mypy_extensions>=0.4.3 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/5c/eb/975c7c080f3223a5cdaff09612f3a5221e4ba534f7039db34c35d95fa6a5/mypy_extensions-0.4.3-py2.py3-none-any.whl
Collecting pydantic>=1.5.1 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/35/47/7ce1c5dbcdc9dfa51cfc61db7bcccfc118c3cebe302971a5e34dda666c9d/pydantic-1.7.3-py3-none-any.whl
Collecting pyyaml>=5.4.1 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/7a/5b/bc0b5ab38247bba158504a410112b6c03f153c652734ece1849749e5f518/PyYAML-5.4.1-cp36-cp36m-manylinux1_x86_64.whl
Collecting requests>=2.25.1 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/29/c1/24814557f1d22c56d50280771a17307e6bf87b70727d975fd6b2ce6b014a/requests-2.25.1-py2.py3-none-any.whl
Collecting sqlalchemy>=1.3.23 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/41/b2/91fa9844a056ce3e67a028ce44fab07ea5fde399382a9216c6c28b7072f9/SQLAlchemy-1.3.23-cp36-cp36m-manylinux1_x86_64.whl
Collecting toml>=0.10.0 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/44/6f/7120676b6d73228c96e17f1f794d8ab046fc910d781c8d151120c3f1569e/toml-0.10.2-py2.py3-none-any.whl
Collecting dataclasses>=0.6 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/fe/ca/75fac5856ab5cfa51bbbcefa250182e50441074fdc3f803f6e76451fab43/dataclasses-0.8-py3-none-any.whl
Collecting typing_extensions>=3.7.4 (from datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/60/7a/e881b5abb54db0e6e671ab088d079c57ce54e8a01a3ca443f561ccadb37e/typing_extensions-3.7.4.3-py3-none-any.whl
Requirement already satisfied: frozendict in /home/me/git-github/datahub/venv/lib/python3.6/site-packages (from avro_gen@ https://api.github.com/repos/rbystrit/avro_gen/tarball/master->datahub==0.0.1)
Requirement already satisfied: pytz in /home/me/git-github/datahub/venv/lib/python3.6/site-packages (from avro_gen@ https://api.github.com/repos/rbystrit/avro_gen/tarball/master->datahub==0.0.1)
Requirement already satisfied: six in /home/me/git-github/datahub/venv/lib/python3.6/site-packages (from avro_gen@ https://api.github.com/repos/rbystrit/avro_gen/tarball/master->datahub==0.0.1)
Requirement already satisfied: tzlocal in /home/me/git-github/datahub/venv/lib/python3.6/site-packages (from avro_gen@ https://api.github.com/repos/rbystrit/avro_gen/tarball/master->datahub==0.0.1)
Collecting chardet<5,>=3.0.2 (from requests>=2.25.1->datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/19/c7/fa589626997dd07bd87d9269342ccb74b1720384a4d739a1872bd84fbe68/chardet-4.0.0-py2.py3-none-any.whl
Collecting idna<3,>=2.5 (from requests>=2.25.1->datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/a2/38/928ddce2273eaa564f6f50de919327bf3a00f091b5baba8dfa9460f3a8a8/idna-2.10-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.25.1->datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/5e/a0/5f06e1e1d463903cf0c0eebeb751791119ed7a4b3737fdc9a77f1cdfb51f/certifi-2020.12.5-py2.py3-none-any.whl
Collecting urllib3<1.27,>=1.21.1 (from requests>=2.25.1->datahub==0.0.1)
Using cached https://files.pythonhosted.org/packages/23/fc/8a49991f7905261f9ca9df5aa9b58363c3c821ce3e7f671895442b7100f2/urllib3-1.26.3-py2.py3-none-any.whl
Building wheels for collected packages: avro-python3, confluent-kafka
Running setup.py bdist_wheel for avro-python3 ... done
Stored in directory: /home/me/.cache/pip/wheels/65/fe/90/20d6d6d97223d80d20cb390be636619c536edab5658c12bdba
Running setup.py bdist_wheel for confluent-kafka ... error
Complete output from command /home/me/git-github/datahub/venv/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-q8kme6th/confluent-kafka/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmp6erfh8zqpip-wheel- --python-tag cp36:
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.6
creating build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/deserializing_consumer.py -> build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/serializing_producer.py -> build/lib.linux-x86_64-3.6/confluent_kafka
creating build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/json_schema.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/schema_registry_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/protobuf.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/avro.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
creating build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/cached_schema_registry_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/load.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
creating build/lib.linux-x86_64-3.6/confluent_kafka/serialization
copying src/confluent_kafka/serialization/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/serialization
creating build/lib.linux-x86_64-3.6/confluent_kafka/admin
copying src/confluent_kafka/admin/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/admin
creating build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_producer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_consumer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
creating build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
copying src/confluent_kafka/avro/serializer/message_serializer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
copying src/confluent_kafka/avro/serializer/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
running build_ext
building 'confluent_kafka.cimpl' extension
creating build/temp.linux-x86_64-3.6
creating build/temp.linux-x86_64-3.6/tmp
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/home/me/git-github/datahub/venv/include -I/usr/include/python3.6m -c /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c -o build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.o
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:66:2: error: #error "confluent-kafka-python requires librdkafka v1.6.0 or later. Install the latest version of librdkafka from the Confluent repositories, see http://docs.confluent.io/current/installation.html"
#error "confluent-kafka-python requires librdkafka v1.6.0 or later. Install the latest version of librdkafka from the Confluent repositories, see http://docs.confluent.io/current/installation.html"
^~~~~
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:190:46: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
PyObject *KafkaError_new_from_error_destroy (rd_kafka_error_t *error);
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:389:31: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’
PyObject *c_cgmd_to_py (const rd_kafka_consumer_group_metadata_t *cgmd);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:390:1: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’; did you mean ‘rd_kafka_metadata_t’?
rd_kafka_consumer_group_metadata_t *py_to_c_cgmd (PyObject *obj);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_metadata_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:387:46: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
PyObject *KafkaError_new_from_error_destroy (rd_kafka_error_t *error) {
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1317:31: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’
PyObject *c_cgmd_to_py (const rd_kafka_consumer_group_metadata_t *cgmd) {
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: In function ‘c_cgmd_to_py’:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1321:9: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
rd_kafka_error_t *error;
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1323:17: warning: implicit declaration of function ‘rd_kafka_consumer_group_metadata_write’; did you mean ‘rd_kafka_consume_start_queue’? [-Wimplicit-function-declaration]
error = rd_kafka_consumer_group_metadata_write(cgmd, &buffer, &size);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_consume_start_queue
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1323:15: warning: assignment makes pointer from integer without a cast [-Wint-conversion]
error = rd_kafka_consumer_group_metadata_write(cgmd, &buffer, &size);
^
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: warning: implicit declaration of function ‘KafkaError_new_from_error_destroy’; did you mean ‘cfl_PyErr_from_error_destroy’? [-Wimplicit-function-declaration]
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: note: in definition of macro ‘cfl_PyErr_from_error_destroy’
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: warning: initialization makes pointer from integer without a cast [-Wint-conversion]
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: note: in definition of macro ‘cfl_PyErr_from_error_destroy’
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: At top level:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1340:1: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’; did you mean ‘rd_kafka_metadata_t’?
rd_kafka_consumer_group_metadata_t *py_to_c_cgmd (PyObject *obj) {
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_metadata_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: In function ‘py_to_c_cgmd’:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1341:9: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’; did you mean ‘rd_kafka_metadata_t’?
rd_kafka_consumer_group_metadata_t *cgmd;
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_metadata_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1342:9: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
rd_kafka_error_t *error;
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1349:17: warning: implicit declaration of function ‘rd_kafka_consumer_group_metadata_read’; did you mean ‘rd_kafka_consume_start_queue’? [-Wimplicit-function-declaration]
error = rd_kafka_consumer_group_metadata_read(&cgmd,
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_consume_start_queue
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1349:15: warning: assignment makes pointer from integer without a cast [-Wint-conversion]
error = rd_kafka_consumer_group_metadata_read(&cgmd,
^
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: warning: initialization makes pointer from integer without a cast [-Wint-conversion]
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: note: in definition of macro ‘cfl_PyErr_from_error_destroy’
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: In function ‘error_cb’:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1379:20: error: ‘RD_KAFKA_RESP_ERR__FATAL’ undeclared (first use in this function); did you mean ‘RD_KAFKA_RESP_ERR__FAIL’?
if (err == RD_KAFKA_RESP_ERR__FATAL) {
^~~~~~~~~~~~~~~~~~~~~~~~
RD_KAFKA_RESP_ERR__FAIL
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1379:20: note: each undeclared identifier is reported only once for each function it appears in
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1381:23: warning: implicit declaration of function ‘rd_kafka_fatal_error’; did you mean ‘rd_kafka_last_error’? [-Wimplicit-function-declaration]
err = rd_kafka_fatal_error(rk, errstr, sizeof(errstr));
^~~~~~~~~~~~~~~~~~~~
rd_kafka_last_error
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
----------------------------------------
Failed building wheel for confluent-kafka
Running setup.py clean for confluent-kafka
Successfully built avro-python3
Failed to build confluent-kafka
Installing collected packages: avro-python3, confluent-kafka, fastavro, mypy-extensions, dataclasses, pydantic, pyyaml, chardet, idna, certifi, urllib3, requests, sqlalchemy, toml, typing-extensions, datahub
Found existing installation: avro-python3 file-.avro-VERSION.txt
Uninstalling avro-python3-file-.avro-VERSION.txt:
Successfully uninstalled avro-python3-file-.avro-VERSION.txt
Running setup.py install for confluent-kafka ... error
Complete output from command /home/me/git-github/datahub/venv/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-q8kme6th/confluent-kafka/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-wys0_d8e-record/install-record.txt --single-version-externally-managed --compile --install-headers /home/me/git-github/datahub/venv/include/site/python3.6/confluent-kafka:
running install
running build
running build_py
creating build
creating build/lib.linux-x86_64-3.6
creating build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/deserializing_consumer.py -> build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka
copying src/confluent_kafka/serializing_producer.py -> build/lib.linux-x86_64-3.6/confluent_kafka
creating build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/json_schema.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/schema_registry_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/protobuf.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/avro.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
copying src/confluent_kafka/schema_registry/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka/schema_registry
creating build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/cached_schema_registry_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/load.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
copying src/confluent_kafka/avro/error.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro
creating build/lib.linux-x86_64-3.6/confluent_kafka/serialization
copying src/confluent_kafka/serialization/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/serialization
creating build/lib.linux-x86_64-3.6/confluent_kafka/admin
copying src/confluent_kafka/admin/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/admin
creating build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_client.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_producer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/verifiable_consumer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
copying src/confluent_kafka/kafkatest/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/kafkatest
creating build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
copying src/confluent_kafka/avro/serializer/message_serializer.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
copying src/confluent_kafka/avro/serializer/__init__.py -> build/lib.linux-x86_64-3.6/confluent_kafka/avro/serializer
running build_ext
building 'confluent_kafka.cimpl' extension
creating build/temp.linux-x86_64-3.6
creating build/temp.linux-x86_64-3.6/tmp
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka
creating build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -g -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -fPIC -I/home/me/git-github/datahub/venv/include -I/usr/include/python3.6m -c /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c -o build/temp.linux-x86_64-3.6/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.o
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:66:2: error: #error "confluent-kafka-python requires librdkafka v1.6.0 or later. Install the latest version of librdkafka from the Confluent repositories, see http://docs.confluent.io/current/installation.html"
#error "confluent-kafka-python requires librdkafka v1.6.0 or later. Install the latest version of librdkafka from the Confluent repositories, see http://docs.confluent.io/current/installation.html"
^~~~~
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:190:46: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
PyObject *KafkaError_new_from_error_destroy (rd_kafka_error_t *error);
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:389:31: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’
PyObject *c_cgmd_to_py (const rd_kafka_consumer_group_metadata_t *cgmd);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:390:1: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’; did you mean ‘rd_kafka_metadata_t’?
rd_kafka_consumer_group_metadata_t *py_to_c_cgmd (PyObject *obj);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_metadata_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:387:46: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
PyObject *KafkaError_new_from_error_destroy (rd_kafka_error_t *error) {
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1317:31: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’
PyObject *c_cgmd_to_py (const rd_kafka_consumer_group_metadata_t *cgmd) {
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: In function ‘c_cgmd_to_py’:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1321:9: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
rd_kafka_error_t *error;
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1323:17: warning: implicit declaration of function ‘rd_kafka_consumer_group_metadata_write’; did you mean ‘rd_kafka_consume_start_queue’? [-Wimplicit-function-declaration]
error = rd_kafka_consumer_group_metadata_write(cgmd, &buffer, &size);
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_consume_start_queue
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1323:15: warning: assignment makes pointer from integer without a cast [-Wint-conversion]
error = rd_kafka_consumer_group_metadata_write(cgmd, &buffer, &size);
^
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: warning: implicit declaration of function ‘KafkaError_new_from_error_destroy’; did you mean ‘cfl_PyErr_from_error_destroy’? [-Wimplicit-function-declaration]
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: note: in definition of macro ‘cfl_PyErr_from_error_destroy’
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: warning: initialization makes pointer from integer without a cast [-Wint-conversion]
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: note: in definition of macro ‘cfl_PyErr_from_error_destroy’
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: At top level:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1340:1: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’; did you mean ‘rd_kafka_metadata_t’?
rd_kafka_consumer_group_metadata_t *py_to_c_cgmd (PyObject *obj) {
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_metadata_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: In function ‘py_to_c_cgmd’:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1341:9: error: unknown type name ‘rd_kafka_consumer_group_metadata_t’; did you mean ‘rd_kafka_metadata_t’?
rd_kafka_consumer_group_metadata_t *cgmd;
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_metadata_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1342:9: error: unknown type name ‘rd_kafka_error_t’; did you mean ‘rd_kafka_event_t’?
rd_kafka_error_t *error;
^~~~~~~~~~~~~~~~
rd_kafka_event_t
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1349:17: warning: implicit declaration of function ‘rd_kafka_consumer_group_metadata_read’; did you mean ‘rd_kafka_consume_start_queue’? [-Wimplicit-function-declaration]
error = rd_kafka_consumer_group_metadata_read(&cgmd,
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
rd_kafka_consume_start_queue
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1349:15: warning: assignment makes pointer from integer without a cast [-Wint-conversion]
error = rd_kafka_consumer_group_metadata_read(&cgmd,
^
In file included from /tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:17:0:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: warning: initialization makes pointer from integer without a cast [-Wint-conversion]
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.h:207:19: note: in definition of macro ‘cfl_PyErr_from_error_destroy’
PyObject *_eo = KafkaError_new_from_error_destroy(error); \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c: In function ‘error_cb’:
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1379:20: error: ‘RD_KAFKA_RESP_ERR__FATAL’ undeclared (first use in this function); did you mean ‘RD_KAFKA_RESP_ERR__FAIL’?
if (err == RD_KAFKA_RESP_ERR__FATAL) {
^~~~~~~~~~~~~~~~~~~~~~~~
RD_KAFKA_RESP_ERR__FAIL
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1379:20: note: each undeclared identifier is reported only once for each function it appears in
/tmp/pip-build-q8kme6th/confluent-kafka/src/confluent_kafka/src/confluent_kafka.c:1381:23: warning: implicit declaration of function ‘rd_kafka_fatal_error’; did you mean ‘rd_kafka_last_error’? [-Wimplicit-function-declaration]
err = rd_kafka_fatal_error(rk, errstr, sizeof(errstr));
^~~~~~~~~~~~~~~~~~~~
rd_kafka_last_error
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
----------------------------------------
Command "/home/me/git-github/datahub/venv/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-q8kme6th/confluent-kafka/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-wys0_d8e-record/install-record.txt --single-version-externally-managed --compile --install-headers /home/me/git-github/datahub/venv/include/site/python3.6/confluent-kafka" failed with error code 1 in /tmp/pip-build-q8kme6th/confluent-kafka/
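The root cause is the `#error` at confluent_kafka.h:66: the system's librdkafka headers are older than v1.6.0, so the confluent-kafka 1.6.0 C extension cannot compile (the later "unknown type name" errors are all fallout from the same stale headers). A hedged sketch of one way to fix this on Debian/Ubuntu, installing newer librdkafka development headers from Confluent's apt repository; the repository path and version below are assumptions based on the docs.confluent.io link in the error message, so check that page for the current details:

```shell
# Add Confluent's apt repository (path/version are assumptions; consult
# the docs.confluent.io link from the compiler error for current values).
wget -qO - https://packages.confluent.io/deb/6.1/archive.key | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://packages.confluent.io/deb/6.1 stable main"

# Install librdkafka development headers (>= 1.6.0) and retry the install.
sudo apt-get update
sudo apt-get install -y librdkafka-dev
pip install -e .
```

Alternatively, if a prebuilt manylinux wheel of confluent-kafka is available for your Python version, it bundles librdkafka and avoids the source build entirely; here pip fell back to the sdist, which is why the local headers mattered.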