
@yihua
Created November 30, 2021 01:45
Kafka Connect with timeline server error (distributed worker startup log)
joyce@Joyces-MacBook-Pro ~/r/kafka-3.0.0-src [SIGINT]> ./bin/connect-distributed.sh $HUDI_DIR/hudi-kafka-connect/demo/connect-distributed.properties
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/core/build/dependant-libs-2.13.6/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/tools/build/dependant-libs-2.13.6/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/trogdor/build/dependant-libs-2.13.6/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/api/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/transforms/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/runtime/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/file/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/mirror/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/mirror-client/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/json/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/joyce/repo/kafka-3.0.0-src/connect/basic-auth-extension/build/dependant-libs/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
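The SLF4J warnings above appear because several Kafka modules' `dependant-libs` directories each ship their own copy of `slf4j-log4j12-1.7.30.jar`; SLF4J simply binds to the first one it finds, so the warning is harmless here. A quick illustrative way to enumerate duplicate bindings from a classpath string (the `CP` value below is a made-up example; in practice you would use the `jvm.classpath` value logged further down):

```shell
# Split a colon-separated classpath into lines and list slf4j-log4j12 bindings.
# CP is a hypothetical sample classpath, not the one from this log.
CP="core/libs/slf4j-log4j12-1.7.30.jar:tools/libs/slf4j-log4j12-1.7.30.jar:core/libs/slf4j-api-1.7.30.jar"
echo "$CP" | tr ':' '\n' | grep 'slf4j-log4j12'
```

Removing all but one `slf4j-log4j12` jar from the classpath would silence the warning, but it does not affect the error below.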
[2021-11-29 17:40:07,059] INFO WorkerInfo values:
jvm.args = -Xms256M, -Xmx2G, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -XX:MaxInlineLevel=15, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/Users/joyce/repo/kafka-3.0.0-src/bin/../logs, -Dlog4j.configuration=file:./bin/../config/connect-log4j.properties
jvm.spec = AdoptOpenJDK, OpenJDK 64-Bit Server VM, 1.8.0_265, 25.265-b01
jvm.classpath = /Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/jopt-simple-5.0.4.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/jackson-dataformat-csv-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/kafka-metadata-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/argparse4j-0.7.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/jackson-core-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/scala-logging_2.13-3.9.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/kafka-storage-api-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-codec-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/scala-collection-compat_2.13-2.4.4.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-transport-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/jackson-module-scala_2.13-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-common-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/scala-reflect-2.13.6.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/kafka-storage-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.1
3.6/commons-cli-1.4.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/paranamer-2.8.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/scala-library-2.13.6.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/kafka-raft-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/metrics-core-2.2.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/zookeeper-3.6.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-resolver-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-handler-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/metrics-core-4.1.12.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-transport-native-epoll-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/audience-annotations-0.5.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-transport-native-unix-common-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/kafka-server-common-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/jackson-datatype-jdk8-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/scala-java8-compat_2.13-1.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/netty-buffer-4.1.62.Final.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/jackson-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/jackson-databind-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/dependant-libs-2.13.6/zookeeper-jute-3.6.3.jar:/Users/joyce/re
po/kafka-3.0.0-src/bin/../examples/build/libs/kafka-examples-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../clients/build/libs/kafka-clients-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../streams/build/libs/kafka-streams-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../streams/examples/build/libs/kafka-streams-examples-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../streams/build/dependant-libs-2.13.6/rocksdbjni-6.19.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../shell/build/libs/kafka-shell-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../shell/build/dependant-libs-2.13.6/jline-3.12.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/libs/kafka-tools-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jakarta.xml.bind-api-2.3.2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/argparse4j-0.7.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jackson-core-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jackson-jaxrs-base-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jackson-jaxrs-json-provider-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jakarta.activation-api-1.2.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/kafka-log4j-appender-3.0.0.j
ar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jackson-datatype-jdk8-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jackson-module-jaxb-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jackson-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../tools/build/dependant-libs-2.13.6/jackson-databind-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/libs/trogdor-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jakarta.xml.bind-api-2.3.2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-util-ajax-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-continuation-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jersey-common-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/argparse4j-0.7.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/activation-1.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jackson-core-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jackson-jaxrs-base-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-io-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/javax.servlet-api-3.1.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jersey-server-2.34.jar:/Users/joyce/repo/kafka-3.0.
0-src/bin/../trogdor/build/dependant-libs-2.13.6/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/hk2-locator-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jakarta.ws.rs-api-2.1.6.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/aopalliance-repackaged-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-servlets-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jackson-jaxrs-json-provider-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-server-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-util-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jaxb-api-2.3.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jakarta.activation-api-1.2.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-servlet-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/osgi-resource-locator-1.0.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jersey-hk2-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jakarta.inject-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/hk2-utils-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/kafka-log4j-appender-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jackson-datatype-jdk8-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bi
n/../trogdor/build/dependant-libs-2.13.6/jersey-container-servlet-core-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/hk2-api-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jersey-container-servlet-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jakarta.annotation-api-1.3.5.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jackson-module-jaxb-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-http-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jersey-client-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jetty-security-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/javassist-3.27.0-GA.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jakarta.validation-api-2.0.2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jackson-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../trogdor/build/dependant-libs-2.13.6/jackson-databind-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/libs/connect-api-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/api/build/dependant-libs/javax.w
s.rs-api-2.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/libs/connect-transforms-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/transforms/build/dependant-libs/javax.ws.rs-api-2.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/libs/connect-runtime-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jakarta.xml.bind-api-2.3.2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-util-ajax-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-continuation-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jersey-common-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/reflections-0.9.12.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/argparse4j-0.7.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/activation-1.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jackson-core-2.12.3.jar:/Users/joyce/repo/
kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jackson-jaxrs-base-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-io-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/javax.servlet-api-3.1.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jersey-server-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/hk2-locator-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/plexus-utils-3.2.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jakarta.ws.rs-api-2.1.6.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/aopalliance-repackaged-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-servlets-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jackson-jaxrs-json-provider-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-server-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-util-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jaxb-api-2.3.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jakarta.activation-api-1.2.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-servlet-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/
repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/osgi-resource-locator-1.0.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jersey-hk2-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jakarta.inject-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/hk2-utils-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/javax.ws.rs-api-2.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/kafka-log4j-appender-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jackson-datatype-jdk8-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jersey-container-servlet-core-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/hk2-api-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jersey-container-servlet-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jakarta.annotation-api-1.3.5.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/maven-artifact-3.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jackson-module-jaxb-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-http-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jersey-client-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-client-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jetty-security-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/kafka-tools-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/javas
sist-3.27.0-GA.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jakarta.validation-api-2.0.2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/commons-lang3-3.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jackson-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/runtime/build/dependant-libs/jackson-databind-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/libs/connect-file-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/file/build/dependant-libs/javax.ws.rs-api-2.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/libs/connect-mirror-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jakarta.xml.bind-api-2.3.2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-util-ajax-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-continuation-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jersey-common-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/reflections-0.9.12.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirro
r/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/argparse4j-0.7.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/activation-1.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jackson-core-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jackson-jaxrs-base-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-io-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/javax.servlet-api-3.1.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jersey-server-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/hk2-locator-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/plexus-utils-3.2.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jakarta.ws.rs-api-2.1.6.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/aopalliance-repackaged-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-servlets-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jackson-jaxrs-json-provider-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-server-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-util-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jaxb-ap
i-2.3.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jakarta.activation-api-1.2.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-servlet-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/osgi-resource-locator-1.0.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jersey-hk2-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jakarta.inject-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/hk2-utils-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/javax.ws.rs-api-2.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/kafka-log4j-appender-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jackson-datatype-jdk8-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jersey-container-servlet-core-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/hk2-api-2.6.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jersey-container-servlet-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jakarta.annotation-api-1.3.5.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/maven-artifact-3.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jackson-module-jaxb-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-http-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jersey-client-2.34.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jett
y-client-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jetty-security-9.4.43.v20210629.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/kafka-tools-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/javassist-3.27.0-GA.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jakarta.validation-api-2.0.2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/commons-lang3-3.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jackson-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror/build/dependant-libs/jackson-databind-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror-client/build/libs/connect-mirror-client-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror-client/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror-client/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror-client/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror-client/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror-client/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/mirror-client/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/libs/connect-json-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/jackson-core-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/r
epo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/javax.ws.rs-api-2.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/jackson-datatype-jdk8-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/jackson-annotations-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/json/build/dependant-libs/jackson-databind-2.12.3.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/libs/connect-basic-auth-extension-3.0.0.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/dependant-libs/slf4j-log4j12-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/dependant-libs/snappy-java-1.1.8.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/dependant-libs/slf4j-api-1.7.30.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/dependant-libs/log4j-1.2.17.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/dependant-libs/zstd-jni-1.5.0-2.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/dependant-libs/lz4-java-1.7.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../connect/basic-auth-extension/build/dependant-libs/javax.ws.rs-api-2.1.1.jar:/Users/joyce/repo/kafka-3.0.0-src/bin/../core/build/libs/kafka_2.13-3.0.0.jar
os.spec = Mac OS X, x86_64, 10.16
os.vcpus = 12
(org.apache.kafka.connect.runtime.WorkerInfo:71)
[2021-11-29 17:40:07,072] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectDistributed:92)
[2021-11-29 17:40:07,097] INFO Loading plugin from: /usr/local/share/kafka/plugins/lib (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:265)
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console. Set system property 'org.apache.logging.log4j.simplelog.StatusLogger.level' to TRACE to show Log4j2 internal initialization logging.
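The `ERROR StatusLogger` line above means some class on the plugin path (plausibly the Hudi connector's embedded timeline server, which uses Log4j2) initialized Log4j2 without finding a configuration file, so it falls back to errors-only console logging. A minimal, hypothetical `log4j2.properties` sketch that would supply a configuration, pointed at via `-Dlog4j.configurationFile=/path/to/log4j2.properties` in the worker's JVM args:

```properties
# Hypothetical minimal Log4j2 configuration (properties format).
status = error
appender.console.type = Console
appender.console.name = STDOUT
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d] %p %m (%c)%n
rootLogger.level = info
rootLogger.appenderRef.stdout.ref = STDOUT
```

This only quiets the StatusLogger fallback; it is unrelated to the plugin-scanning output that follows.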
[2021-11-29 17:40:12,454] INFO Registered loader: PluginClassLoader{pluginLocation=file:/usr/local/share/kafka/plugins/lib/} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:288)
[2021-11-29 17:40:12,454] INFO Added plugin 'io.confluent.connect.hdfs.HdfsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:12,454] INFO Added plugin 'io.confluent.connect.storage.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:12,455] INFO Added plugin 'org.apache.hudi.connect.HoodieSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:12,455] INFO Added plugin 'io.confluent.connect.hdfs.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:12,455] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:12,455] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:12,455] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,595] INFO Registered loader: sun.misc.Launcher$AppClassLoader@764c12b6 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:288)
[2021-11-29 17:40:13,595] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,596] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,596] INFO Added plugin 'org.apache.kafka.connect.tools.SchemaSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,596] INFO Added plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,597] INFO Added plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,598] INFO Added plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.ReplaceField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.Filter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.HeaderFrom$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,599] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.MaskField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.DropHeaders' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.Cast$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,600] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertHeader' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.InsertField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.Flatten$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.HeaderFrom$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.SetSchemaMetadata$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.ExtractField$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.TimestampConverter$Value' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,601] INFO Added plugin 'org.apache.kafka.connect.transforms.HoistField$Key' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,602] INFO Added plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,602] INFO Added plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,602] INFO Added plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,602] INFO Added plugin 'org.apache.kafka.common.config.provider.FileConfigProvider' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,602] INFO Added plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:217)
[2021-11-29 17:40:13,603] INFO Added aliases 'HdfsSinkConnector' and 'HdfsSink' to plugin 'io.confluent.connect.hdfs.HdfsSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,603] INFO Added aliases 'HoodieSinkConnector' and 'HoodieSink' to plugin 'org.apache.hudi.connect.HoodieSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,603] INFO Added aliases 'FileStreamSinkConnector' and 'FileStreamSink' to plugin 'org.apache.kafka.connect.file.FileStreamSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,604] INFO Added aliases 'FileStreamSourceConnector' and 'FileStreamSource' to plugin 'org.apache.kafka.connect.file.FileStreamSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,604] INFO Added aliases 'MirrorCheckpointConnector' and 'MirrorCheckpoint' to plugin 'org.apache.kafka.connect.mirror.MirrorCheckpointConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,604] INFO Added aliases 'MirrorHeartbeatConnector' and 'MirrorHeartbeat' to plugin 'org.apache.kafka.connect.mirror.MirrorHeartbeatConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,604] INFO Added aliases 'MirrorSourceConnector' and 'MirrorSource' to plugin 'org.apache.kafka.connect.mirror.MirrorSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,604] INFO Added aliases 'MockConnector' and 'Mock' to plugin 'org.apache.kafka.connect.tools.MockConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,604] INFO Added aliases 'MockSinkConnector' and 'MockSink' to plugin 'org.apache.kafka.connect.tools.MockSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,605] INFO Added aliases 'MockSourceConnector' and 'MockSource' to plugin 'org.apache.kafka.connect.tools.MockSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,605] INFO Added aliases 'VerifiableSinkConnector' and 'VerifiableSink' to plugin 'org.apache.kafka.connect.tools.VerifiableSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,605] INFO Added aliases 'VerifiableSourceConnector' and 'VerifiableSource' to plugin 'org.apache.kafka.connect.tools.VerifiableSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,605] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,605] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,605] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,605] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,606] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,606] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,606] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,606] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,606] INFO Added aliases 'ByteArrayConverter' and 'ByteArray' to plugin 'org.apache.kafka.connect.converters.ByteArrayConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,606] INFO Added aliases 'DoubleConverter' and 'Double' to plugin 'org.apache.kafka.connect.converters.DoubleConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,607] INFO Added aliases 'FloatConverter' and 'Float' to plugin 'org.apache.kafka.connect.converters.FloatConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,607] INFO Added aliases 'IntegerConverter' and 'Integer' to plugin 'org.apache.kafka.connect.converters.IntegerConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,607] INFO Added aliases 'LongConverter' and 'Long' to plugin 'org.apache.kafka.connect.converters.LongConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,607] INFO Added aliases 'ShortConverter' and 'Short' to plugin 'org.apache.kafka.connect.converters.ShortConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,607] INFO Added aliases 'JsonConverter' and 'Json' to plugin 'org.apache.kafka.connect.json.JsonConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,607] INFO Added alias 'SimpleHeaderConverter' to plugin 'org.apache.kafka.connect.storage.SimpleHeaderConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,608] INFO Added aliases 'StringConverter' and 'String' to plugin 'org.apache.kafka.connect.storage.StringConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,608] INFO Added aliases 'PredicatedTransformation' and 'Predicated' to plugin 'org.apache.kafka.connect.runtime.PredicatedTransformation' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,608] INFO Added alias 'DropHeaders' to plugin 'org.apache.kafka.connect.transforms.DropHeaders' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,609] INFO Added alias 'Filter' to plugin 'org.apache.kafka.connect.transforms.Filter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,609] INFO Added alias 'InsertHeader' to plugin 'org.apache.kafka.connect.transforms.InsertHeader' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,609] INFO Added alias 'RegexRouter' to plugin 'org.apache.kafka.connect.transforms.RegexRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,609] INFO Added alias 'TimestampRouter' to plugin 'org.apache.kafka.connect.transforms.TimestampRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,610] INFO Added alias 'ValueToKey' to plugin 'org.apache.kafka.connect.transforms.ValueToKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,610] INFO Added alias 'HasHeaderKey' to plugin 'org.apache.kafka.connect.transforms.predicates.HasHeaderKey' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,610] INFO Added alias 'RecordIsTombstone' to plugin 'org.apache.kafka.connect.transforms.predicates.RecordIsTombstone' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,610] INFO Added alias 'TopicNameMatches' to plugin 'org.apache.kafka.connect.transforms.predicates.TopicNameMatches' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,610] INFO Added alias 'BasicAuthSecurityRestExtension' to plugin 'org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:459)
[2021-11-29 17:40:13,610] INFO Added aliases 'AllConnectorClientConfigOverridePolicy' and 'All' to plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,611] INFO Added aliases 'NoneConnectorClientConfigOverridePolicy' and 'None' to plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,611] INFO Added aliases 'PrincipalConnectorClientConfigOverridePolicy' and 'Principal' to plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:462)
[2021-11-29 17:40:13,665] INFO DistributedConfig values:
access.control.allow.methods =
access.control.allow.origin =
admin.listeners = null
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
config.providers = []
config.storage.replication.factor = 1
config.storage.topic = connect-configs
connect.protocol = sessioned
connections.max.idle.ms = 540000
connector.client.config.override.policy = All
group.id = hudi-connect-cluster
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
heartbeat.interval.ms = 3000
inter.worker.key.generation.algorithm = HmacSHA256
inter.worker.key.size = null
inter.worker.key.ttl.ms = 3600000
inter.worker.signature.algorithm = HmacSHA256
inter.worker.verification.algorithms = [HmacSHA256]
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = [HTTP://:8083]
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 60000
offset.flush.timeout.ms = 5000
offset.storage.partitions = 25
offset.storage.replication.factor = 1
offset.storage.topic = connect-offsets
plugin.path = [/usr/local/share/kafka/plugins]
rebalance.timeout.ms = 60000
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 40000
response.http.headers.config =
rest.advertised.host.name = null
rest.advertised.listener = null
rest.advertised.port = null
rest.extension.classes = []
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
scheduled.rebalance.max.delay.ms = 300000
security.protocol = PLAINTEXT
send.buffer.bytes = 131072
session.timeout.ms = 10000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.client.auth = none
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
status.storage.partitions = 5
status.storage.replication.factor = 1
status.storage.topic = connect-status
task.shutdown.graceful.timeout.ms = 5000
topic.creation.enable = true
topic.tracking.allow.reset = true
topic.tracking.enable = true
value.converter = class org.apache.kafka.connect.json.JsonConverter
worker.sync.timeout.ms = 3000
worker.unsync.backoff.ms = 300000
(org.apache.kafka.connect.runtime.distributed.DistributedConfig:376)
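For reference, the non-default values in the DistributedConfig dump above correspond to a worker properties file roughly like the following. This is a reconstruction from the logged values (plus the `*.schemas.enable` keys visible in the "not a known config" warnings below), not the actual `$HUDI_DIR/hudi-kafka-connect/demo/connect-distributed.properties` contents:

```properties
# Broker and cluster identity (values taken from the log above)
bootstrap.servers=kafkabroker:9092
group.id=hudi-connect-cluster

# Internal topics; replication factor 1 is only suitable for a single-broker dev setup
config.storage.topic=connect-configs
config.storage.replication.factor=1
offset.storage.topic=connect-offsets
offset.storage.replication.factor=1
status.storage.topic=connect-status
status.storage.replication.factor=1

# Converters (schemas.enable keys inferred from the AdminClientConfig warnings)
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false

# REST listener, plugin path, and client-override policy
listeners=HTTP://:8083
plugin.path=/usr/local/share/kafka/plugins
connector.client.config.override.policy=All
```

Note that the "was supplied but isn't a known config" warnings that follow are expected: the worker passes its full properties to the embedded AdminClient, which ignores the Connect-specific keys.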
[2021-11-29 17:40:13,667] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:49)
[2021-11-29 17:40:13,671] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:13,752] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,752] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,753] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,753] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,753] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,753] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,753] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:13,753] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:13,753] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:13,754] INFO Kafka startTimeMs: 1638236413753 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,101] INFO Kafka cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.connect.util.ConnectUtils:65)
[2021-11-29 17:40:14,103] INFO App info kafka.admin.client for adminclient-1 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-11-29 17:40:14,112] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-11-29 17:40:14,112] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-11-29 17:40:14,112] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-11-29 17:40:14,126] INFO Logging initialized @7600ms to org.eclipse.jetty.util.log.Slf4jLog (org.eclipse.jetty.util.log:170)
[2021-11-29 17:40:14,181] INFO Added connector for HTTP://:8083 (org.apache.kafka.connect.runtime.rest.RestServer:117)
[2021-11-29 17:40:14,181] INFO Initializing REST server (org.apache.kafka.connect.runtime.rest.RestServer:188)
[2021-11-29 17:40:14,189] INFO jetty-9.4.43.v20210629; built: 2021-06-30T11:07:22.254Z; git: 526006ecfa3af7f1a27ef3a288e2bef7ea9dd7e8; jvm 1.8.0_265-b01 (org.eclipse.jetty.server.Server:375)
[2021-11-29 17:40:14,215] INFO Started http_8083@3a3316b6{HTTP/1.1, (http/1.1)}{0.0.0.0:8083} (org.eclipse.jetty.server.AbstractConnector:331)
[2021-11-29 17:40:14,216] INFO Started @7691ms (org.eclipse.jetty.server.Server:415)
[2021-11-29 17:40:14,238] INFO Advertised URI: http://127.0.0.1:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:355)
[2021-11-29 17:40:14,238] INFO REST server listening at http://127.0.0.1:8083/, advertising URL http://127.0.0.1:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:203)
[2021-11-29 17:40:14,238] INFO Advertised URI: http://127.0.0.1:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:355)
[2021-11-29 17:40:14,238] INFO REST admin endpoints at http://127.0.0.1:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:204)
[2021-11-29 17:40:14,239] INFO Advertised URI: http://127.0.0.1:8083/ (org.apache.kafka.connect.runtime.rest.RestServer:355)
[2021-11-29 17:40:14,242] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:49)
[2021-11-29 17:40:14,243] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:14,247] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,247] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,247] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,247] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,247] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,248] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,248] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,249] INFO Kafka startTimeMs: 1638236414248 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,257] INFO Kafka cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.connect.util.ConnectUtils:65)
[2021-11-29 17:40:14,257] INFO App info kafka.admin.client for adminclient-2 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-11-29 17:40:14,260] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-11-29 17:40:14,260] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-11-29 17:40:14,260] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-11-29 17:40:14,265] INFO Setting up All Policy for ConnectorClientConfigOverride. This will allow all client configurations to be overridden (org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy:44)
[2021-11-29 17:40:14,274] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:49)
[2021-11-29 17:40:14,274] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:14,277] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,277] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,277] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,278] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,279] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,279] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,279] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,279] INFO Kafka startTimeMs: 1638236414279 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,287] INFO Kafka cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.connect.util.ConnectUtils:65)
[2021-11-29 17:40:14,288] INFO App info kafka.admin.client for adminclient-3 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-11-29 17:40:14,290] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-11-29 17:40:14,290] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-11-29 17:40:14,290] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-11-29 17:40:14,294] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,294] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,294] INFO Kafka startTimeMs: 1638236414294 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,437] INFO JsonConverterConfig values:
converter.type = key
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = false
(org.apache.kafka.connect.json.JsonConverterConfig:376)
[2021-11-29 17:40:14,439] INFO JsonConverterConfig values:
converter.type = value
decimal.format = BASE64
schemas.cache.size = 1000
schemas.enable = false
(org.apache.kafka.connect.json.JsonConverterConfig:376)
[2021-11-29 17:40:14,439] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:49)
[2021-11-29 17:40:14,439] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:14,442] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,442] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,442] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,442] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,442] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,443] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,444] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,444] INFO Kafka startTimeMs: 1638236414443 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,452] INFO Kafka cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.connect.util.ConnectUtils:65)
[2021-11-29 17:40:14,453] INFO App info kafka.admin.client for adminclient-4 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-11-29 17:40:14,455] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-11-29 17:40:14,455] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-11-29 17:40:14,455] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-11-29 17:40:14,465] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:49)
[2021-11-29 17:40:14,465] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:14,469] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,469] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,470] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,470] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,470] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,470] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,470] INFO Kafka startTimeMs: 1638236414470 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,480] INFO Kafka cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.connect.util.ConnectUtils:65)
[2021-11-29 17:40:14,481] INFO App info kafka.admin.client for adminclient-5 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-11-29 17:40:14,483] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-11-29 17:40:14,483] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-11-29 17:40:14,483] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-11-29 17:40:14,487] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:49)
[2021-11-29 17:40:14,488] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:14,491] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,491] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,491] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,491] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,491] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,491] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,491] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,491] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,492] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,492] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,492] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,492] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,492] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,492] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,492] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,492] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,492] INFO Kafka startTimeMs: 1638236414492 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,499] INFO Kafka cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.connect.util.ConnectUtils:65)
[2021-11-29 17:40:14,500] INFO App info kafka.admin.client for adminclient-6 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-11-29 17:40:14,502] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-11-29 17:40:14,502] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-11-29 17:40:14,502] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-11-29 17:40:14,521] INFO Creating Kafka admin client (org.apache.kafka.connect.util.ConnectUtils:49)
[2021-11-29 17:40:14,521] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:14,523] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,523] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,524] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,524] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,524] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,524] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,524] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,524] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,524] INFO Kafka startTimeMs: 1638236414524 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,532] INFO Kafka cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.connect.util.ConnectUtils:65)
[2021-11-29 17:40:14,532] INFO App info kafka.admin.client for adminclient-7 unregistered (org.apache.kafka.common.utils.AppInfoParser:83)
[2021-11-29 17:40:14,535] INFO Metrics scheduler closed (org.apache.kafka.common.metrics.Metrics:659)
[2021-11-29 17:40:14,535] INFO Closing reporter org.apache.kafka.common.metrics.JmxReporter (org.apache.kafka.common.metrics.Metrics:663)
[2021-11-29 17:40:14,535] INFO Metrics reporters closed (org.apache.kafka.common.metrics.Metrics:669)
[2021-11-29 17:40:14,558] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,558] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,558] INFO Kafka startTimeMs: 1638236414558 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,561] INFO Kafka Connect distributed worker initialization took 7489ms (org.apache.kafka.connect.cli.ConnectDistributed:138)
[2021-11-29 17:40:14,561] INFO Kafka Connect starting (org.apache.kafka.connect.runtime.Connect:51)
[2021-11-29 17:40:14,562] INFO Initializing REST resources (org.apache.kafka.connect.runtime.rest.RestServer:208)
[2021-11-29 17:40:14,562] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Herder starting (org.apache.kafka.connect.runtime.distributed.DistributedHerder:318)
[2021-11-29 17:40:14,563] INFO Worker starting (org.apache.kafka.connect.runtime.Worker:189)
[2021-11-29 17:40:14,563] INFO Starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:144)
[2021-11-29 17:40:14,563] INFO Starting KafkaBasedLog with topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:162)
[2021-11-29 17:40:14,563] INFO AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:14,565] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,565] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,566] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,567] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,567] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,567] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,567] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.admin.AdminClientConfig:384)
[2021-11-29 17:40:14,567] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,567] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,567] INFO Kafka startTimeMs: 1638236414567 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,616] INFO Adding admin resources to main listener (org.apache.kafka.connect.runtime.rest.RestServer:225)
[2021-11-29 17:40:14,619] INFO ProducerConfig values:
acks = -1
batch.size = 16384
bootstrap.servers = [kafkabroker:9092]
buffer.memory = 33554432
client.dns.lookup = use_all_dns_ips
client.id = producer-1
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 2147483647
enable.idempotence = true
interceptor.classes = []
key.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 1
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
(org.apache.kafka.clients.producer.ProducerConfig:376)
[2021-11-29 17:40:14,640] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,641] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,641] WARN The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,641] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,641] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,641] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,641] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,641] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,642] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,642] WARN The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,642] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,642] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,642] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,642] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,643] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,643] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,643] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,643] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,643] INFO Kafka startTimeMs: 1638236414643 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,649] INFO [Producer clientId=producer-1] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:14,654] INFO ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [kafkabroker:9092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = consumer-hudi-connect-cluster-1
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = hudi-connect-cluster
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:14,686] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,686] WARN The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,687] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,687] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,687] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,687] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,687] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,687] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
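(Editor's note: the `isn't a known config` WARNs above are benign. Kafka Connect passes the full worker configuration to its embedded admin, producer, and consumer clients, so each client warns about worker-level keys it does not recognize, such as `config.storage.topic` and `plugin.path`. If the noise is unwanted, a sketch of how it could be suppressed, assuming the stock log4j 1.x setup shipped with Kafka 3.0 and that the worker is started with `config/connect-log4j.properties`:)

```properties
# Hypothetical additions to config/connect-log4j.properties: raise the level
# of each client's config logger so the "supplied but isn't a known config"
# WARNs emitted at worker startup are dropped.
log4j.logger.org.apache.kafka.clients.admin.AdminClientConfig=ERROR
log4j.logger.org.apache.kafka.clients.producer.ProducerConfig=ERROR
log4j.logger.org.apache.kafka.clients.consumer.ConsumerConfig=ERROR
```

(This only affects logging verbosity; the unknown keys themselves are harmless and are ignored by the clients.)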
[2021-11-29 17:40:14,687] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,687] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,687] INFO Kafka startTimeMs: 1638236414687 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,694] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:14,700] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Subscribed to partition(s): connect-offsets-0, connect-offsets-5, connect-offsets-10, connect-offsets-20, connect-offsets-15, connect-offsets-9, connect-offsets-11, connect-offsets-4, connect-offsets-16, connect-offsets-17, connect-offsets-3, connect-offsets-24, connect-offsets-23, connect-offsets-13, connect-offsets-18, connect-offsets-22, connect-offsets-8, connect-offsets-2, connect-offsets-12, connect-offsets-19, connect-offsets-14, connect-offsets-1, connect-offsets-6, connect-offsets-7, connect-offsets-21 (org.apache.kafka.clients.consumer.KafkaConsumer:1121)
[2021-11-29 17:40:14,706] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,707] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-5 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,707] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-10 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,707] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-20 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,707] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-15 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,707] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-9 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,707] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-11 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-4 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-16 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-17 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-3 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-24 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-23 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-13 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-18 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,708] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-22 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,709] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-8 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,709] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-2 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,709] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-12 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,709] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-19 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,709] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-14 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,709] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-1 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,709] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-6 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,710] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-7 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,710] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-offsets-21 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,716] INFO DefaultSessionIdManager workerName=node0 (org.eclipse.jetty.server.session:334)
[2021-11-29 17:40:14,716] INFO No SessionScavenger set, using defaults (org.eclipse.jetty.server.session:339)
[2021-11-29 17:40:14,718] INFO node0 Scavenging every 660000ms (org.eclipse.jetty.server.session:132)
[2021-11-29 17:40:14,754] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-10 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,755] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-8 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,756] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-14 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,756] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-12 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,756] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-2 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,756] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,757] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-6 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,757] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-4 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,757] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-24 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,757] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-18 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,758] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-16 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,758] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-22 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,758] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-20 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,758] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-9 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,758] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-7 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,759] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-13 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,759] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-11 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,759] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-1 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,759] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-5 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,759] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-3 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,760] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-23 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,760] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-17 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,760] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-15 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,761] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-21 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,761] INFO [Consumer clientId=consumer-hudi-connect-cluster-1, groupId=hudi-connect-cluster] Resetting offset for partition connect-offsets-19 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,761] INFO Finished reading KafkaBasedLog for topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:202)
[2021-11-29 17:40:14,762] INFO Started KafkaBasedLog for topic connect-offsets (org.apache.kafka.connect.util.KafkaBasedLog:204)
[2021-11-29 17:40:14,763] INFO Finished reading offsets topic and starting KafkaOffsetBackingStore (org.apache.kafka.connect.storage.KafkaOffsetBackingStore:146)
[2021-11-29 17:40:14,765] INFO Worker started (org.apache.kafka.connect.runtime.Worker:196)
[2021-11-29 17:40:14,765] INFO Starting KafkaBasedLog with topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:162)
[2021-11-29 17:40:14,776] INFO ProducerConfig values:
	acks = -1
	batch.size = 16384
	bootstrap.servers = [kafkabroker:9092]
	buffer.memory = 33554432
	client.dns.lookup = use_all_dns_ips
	client.id = producer-2
	compression.type = none
	connections.max.idle.ms = 540000
	delivery.timeout.ms = 120000
	enable.idempotence = true
	interceptor.classes = []
	key.serializer = class org.apache.kafka.common.serialization.StringSerializer
	linger.ms = 0
	max.block.ms = 60000
	max.in.flight.requests.per.connection = 1
	max.request.size = 1048576
	metadata.max.age.ms = 300000
	metadata.max.idle.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
	receive.buffer.bytes = 32768
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 30000
	retries = 0
	retry.backoff.ms = 100
	sasl.client.callback.handler.class = null
	sasl.jaas.config = null
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.login.callback.handler.class = null
	sasl.login.class = null
	sasl.login.refresh.buffer.seconds = 300
	sasl.login.refresh.min.period.seconds = 60
	sasl.login.refresh.window.factor = 0.8
	sasl.login.refresh.window.jitter = 0.05
	sasl.mechanism = GSSAPI
	security.protocol = PLAINTEXT
	security.providers = null
	send.buffer.bytes = 131072
	socket.connection.setup.timeout.max.ms = 30000
	socket.connection.setup.timeout.ms = 10000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2]
	ssl.endpoint.identification.algorithm = https
	ssl.engine.factory.class = null
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.certificate.chain = null
	ssl.keystore.key = null
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLSv1.2
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.certificates = null
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	transaction.timeout.ms = 60000
	transactional.id = null
	value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
(org.apache.kafka.clients.producer.ProducerConfig:376)
[2021-11-29 17:40:14,780] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,781] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,781] WARN The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,782] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,782] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,783] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,784] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,785] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,785] INFO Kafka startTimeMs: 1638236414784 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,785] INFO ConsumerConfig values:
	allow.auto.create.topics = true
	auto.commit.interval.ms = 5000
	auto.offset.reset = earliest
	bootstrap.servers = [kafkabroker:9092]
	check.crcs = true
	client.dns.lookup = use_all_dns_ips
	client.id = consumer-hudi-connect-cluster-2
	client.rack =
	connections.max.idle.ms = 540000
	default.api.timeout.ms = 60000
	enable.auto.commit = false
	exclude.internal.topics = true
	fetch.max.bytes = 52428800
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	group.id = hudi-connect-cluster
	group.instance.id = null
	heartbeat.interval.ms = 3000
	interceptor.classes = []
	internal.leave.group.on.close = true
	internal.throw.on.fetch.stable.offset.unsupported = false
	isolation.level = read_uncommitted
	key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
	max.partition.fetch.bytes = 1048576
	max.poll.interval.ms = 300000
	max.poll.records = 500
	metadata.max.age.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
	receive.buffer.bytes = 65536
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 30000
	retry.backoff.ms = 100
	sasl.client.callback.handler.class = null
	sasl.jaas.config = null
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.login.callback.handler.class = null
	sasl.login.class = null
	sasl.login.refresh.buffer.seconds = 300
	sasl.login.refresh.min.period.seconds = 60
	sasl.login.refresh.window.factor = 0.8
	sasl.login.refresh.window.jitter = 0.05
	sasl.mechanism = GSSAPI
	security.protocol = PLAINTEXT
	security.providers = null
	send.buffer.bytes = 131072
	session.timeout.ms = 45000
	socket.connection.setup.timeout.max.ms = 30000
	socket.connection.setup.timeout.ms = 10000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2]
	ssl.endpoint.identification.algorithm = https
	ssl.engine.factory.class = null
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.certificate.chain = null
	ssl.keystore.key = null
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLSv1.2
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.certificates = null
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:14,786] INFO [Producer clientId=producer-2] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:14,790] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,790] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,790] WARN The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,790] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,790] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,791] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,791] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,791] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,791] WARN The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,791] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,791] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,791] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,792] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,792] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,792] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,792] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,792] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,793] INFO Kafka startTimeMs: 1638236414792 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,797] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:14,797] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Subscribed to partition(s): connect-status-0, connect-status-4, connect-status-1, connect-status-2, connect-status-3 (org.apache.kafka.clients.consumer.KafkaConsumer:1121)
[2021-11-29 17:40:14,798] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-status-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,798] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-status-4 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,798] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-status-1 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,798] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-status-2 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,798] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-status-3 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,810] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Resetting offset for partition connect-status-1 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,810] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Resetting offset for partition connect-status-2 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,810] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Resetting offset for partition connect-status-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,811] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Resetting offset for partition connect-status-3 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,811] INFO [Consumer clientId=consumer-hudi-connect-cluster-2, groupId=hudi-connect-cluster] Resetting offset for partition connect-status-4 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,860] INFO Finished reading KafkaBasedLog for topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:202)
[2021-11-29 17:40:14,861] INFO Started KafkaBasedLog for topic connect-status (org.apache.kafka.connect.util.KafkaBasedLog:204)
[2021-11-29 17:40:14,866] INFO Starting KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:291)
[2021-11-29 17:40:14,866] INFO Starting KafkaBasedLog with topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:162)
[2021-11-29 17:40:14,877] INFO ProducerConfig values:
	acks = -1
	batch.size = 16384
	bootstrap.servers = [kafkabroker:9092]
	buffer.memory = 33554432
	client.dns.lookup = use_all_dns_ips
	client.id = producer-3
	compression.type = none
	connections.max.idle.ms = 540000
	delivery.timeout.ms = 2147483647
	enable.idempotence = true
	interceptor.classes = []
	key.serializer = class org.apache.kafka.common.serialization.StringSerializer
	linger.ms = 0
	max.block.ms = 60000
	max.in.flight.requests.per.connection = 1
	max.request.size = 1048576
	metadata.max.age.ms = 300000
	metadata.max.idle.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
	receive.buffer.bytes = 32768
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 30000
	retries = 2147483647
	retry.backoff.ms = 100
	sasl.client.callback.handler.class = null
	sasl.jaas.config = null
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.login.callback.handler.class = null
	sasl.login.class = null
	sasl.login.refresh.buffer.seconds = 300
	sasl.login.refresh.min.period.seconds = 60
	sasl.login.refresh.window.factor = 0.8
	sasl.login.refresh.window.jitter = 0.05
	sasl.mechanism = GSSAPI
	security.protocol = PLAINTEXT
	security.providers = null
	send.buffer.bytes = 131072
	socket.connection.setup.timeout.max.ms = 30000
	socket.connection.setup.timeout.ms = 10000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2]
	ssl.endpoint.identification.algorithm = https
	ssl.engine.factory.class = null
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.certificate.chain = null
	ssl.keystore.key = null
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLSv1.2
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.certificates = null
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	transaction.timeout.ms = 60000
	transactional.id = null
	value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
(org.apache.kafka.clients.producer.ProducerConfig:376)
[2021-11-29 17:40:14,881] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,881] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,881] WARN The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'group.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,882] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,883] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,883] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,883] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,883] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.producer.ProducerConfig:384)
[2021-11-29 17:40:14,883] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,883] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,883] INFO Kafka startTimeMs: 1638236414883 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,884] INFO ConsumerConfig values:
	allow.auto.create.topics = true
	auto.commit.interval.ms = 5000
	auto.offset.reset = earliest
	bootstrap.servers = [kafkabroker:9092]
	check.crcs = true
	client.dns.lookup = use_all_dns_ips
	client.id = consumer-hudi-connect-cluster-3
	client.rack =
	connections.max.idle.ms = 540000
	default.api.timeout.ms = 60000
	enable.auto.commit = false
	exclude.internal.topics = true
	fetch.max.bytes = 52428800
	fetch.max.wait.ms = 500
	fetch.min.bytes = 1
	group.id = hudi-connect-cluster
	group.instance.id = null
	heartbeat.interval.ms = 3000
	interceptor.classes = []
	internal.leave.group.on.close = true
	internal.throw.on.fetch.stable.offset.unsupported = false
	isolation.level = read_uncommitted
	key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
	max.partition.fetch.bytes = 1048576
	max.poll.interval.ms = 300000
	max.poll.records = 500
	metadata.max.age.ms = 300000
	metric.reporters = []
	metrics.num.samples = 2
	metrics.recording.level = INFO
	metrics.sample.window.ms = 30000
	partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
	receive.buffer.bytes = 65536
	reconnect.backoff.max.ms = 1000
	reconnect.backoff.ms = 50
	request.timeout.ms = 30000
	retry.backoff.ms = 100
	sasl.client.callback.handler.class = null
	sasl.jaas.config = null
	sasl.kerberos.kinit.cmd = /usr/bin/kinit
	sasl.kerberos.min.time.before.relogin = 60000
	sasl.kerberos.service.name = null
	sasl.kerberos.ticket.renew.jitter = 0.05
	sasl.kerberos.ticket.renew.window.factor = 0.8
	sasl.login.callback.handler.class = null
	sasl.login.class = null
	sasl.login.refresh.buffer.seconds = 300
	sasl.login.refresh.min.period.seconds = 60
	sasl.login.refresh.window.factor = 0.8
	sasl.login.refresh.window.jitter = 0.05
	sasl.mechanism = GSSAPI
	security.protocol = PLAINTEXT
	security.providers = null
	send.buffer.bytes = 131072
	session.timeout.ms = 45000
	socket.connection.setup.timeout.max.ms = 30000
	socket.connection.setup.timeout.ms = 10000
	ssl.cipher.suites = null
	ssl.enabled.protocols = [TLSv1.2]
	ssl.endpoint.identification.algorithm = https
	ssl.engine.factory.class = null
	ssl.key.password = null
	ssl.keymanager.algorithm = SunX509
	ssl.keystore.certificate.chain = null
	ssl.keystore.key = null
	ssl.keystore.location = null
	ssl.keystore.password = null
	ssl.keystore.type = JKS
	ssl.protocol = TLSv1.2
	ssl.provider = null
	ssl.secure.random.implementation = null
	ssl.trustmanager.algorithm = PKIX
	ssl.truststore.certificates = null
	ssl.truststore.location = null
	ssl.truststore.password = null
	ssl.truststore.type = JKS
	value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:14,886] INFO [Producer clientId=producer-3] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:14,888] WARN The configuration 'config.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,888] WARN The configuration 'listeners' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'status.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'plugin.path' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'config.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'offset.flush.interval.ms' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'key.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'status.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'value.converter.schemas.enable' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'offset.storage.replication.factor' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'offset.storage.topic' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'value.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,889] WARN The configuration 'key.converter' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:14,890] INFO Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:14,890] INFO Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:14,890] INFO Kafka startTimeMs: 1638236414890 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:14,894] INFO [Consumer clientId=consumer-hudi-connect-cluster-3, groupId=hudi-connect-cluster] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:14,894] INFO [Consumer clientId=consumer-hudi-connect-cluster-3, groupId=hudi-connect-cluster] Subscribed to partition(s): connect-configs-0 (org.apache.kafka.clients.consumer.KafkaConsumer:1121)
[2021-11-29 17:40:14,895] INFO [Consumer clientId=consumer-hudi-connect-cluster-3, groupId=hudi-connect-cluster] Seeking to EARLIEST offset of partition connect-configs-0 (org.apache.kafka.clients.consumer.internals.SubscriptionState:641)
[2021-11-29 17:40:14,903] INFO [Consumer clientId=consumer-hudi-connect-cluster-3, groupId=hudi-connect-cluster] Resetting offset for partition connect-configs-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:14,909] INFO Successfully processed removal of connector 'hudi-sink' (org.apache.kafka.connect.storage.KafkaConfigBackingStore:630)
[2021-11-29 17:40:14,910] INFO Finished reading KafkaBasedLog for topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:202)
[2021-11-29 17:40:14,910] INFO Started KafkaBasedLog for topic connect-configs (org.apache.kafka.connect.util.KafkaBasedLog:204)
[2021-11-29 17:40:14,910] INFO Started KafkaConfigBackingStore (org.apache.kafka.connect.storage.KafkaConfigBackingStore:306)
[2021-11-29 17:40:14,911] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Herder started (org.apache.kafka.connect.runtime.distributed.DistributedHerder:322)
[2021-11-29 17:40:14,919] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:14,920] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Discovered group coordinator localhost:9092 (id: 2147483647 rack: null) (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:849)
[2021-11-29 17:40:14,922] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222)
[2021-11-29 17:40:14,922] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] (Re-)joining group (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:535)
[2021-11-29 17:40:14,931] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] (Re-)joining group (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:535)
[2021-11-29 17:40:14,934] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Successfully joined group with generation Generation{generationId=7, memberId='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:591)
[2021-11-29 17:40:14,959] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Successfully synced group in generation Generation{generationId=7, memberId='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:757)
[2021-11-29 17:40:14,960] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Joined group at generation 7 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', leaderUrl='http://127.0.0.1:8083/', offset=9, connectorIds=[], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1848)
[2021-11-29 17:40:14,960] WARN [Worker clientId=connect-1, groupId=hudi-connect-cluster] Catching up to assignment's config offset. (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1253)
[2021-11-29 17:40:14,960] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Current config state offset -1 is behind group assignment 9, reading to end of config log (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1317)
[2021-11-29 17:40:14,963] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Finished reading to end of log and updated config snapshot, new config log offset: 9 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1324)
[2021-11-29 17:40:14,964] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting connectors and tasks using config offset 9 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1378)
[2021-11-29 17:40:14,964] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1406)
[2021-11-29 17:40:15,010] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Session key updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1716)
Nov 29, 2021 5:40:15 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.RootResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.RootResource will be ignored.
Nov 29, 2021 5:40:15 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource will be ignored.
Nov 29, 2021 5:40:15 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource will be ignored.
Nov 29, 2021 5:40:15 PM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
WARNING: A provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. Due to constraint configuration problems the provider org.apache.kafka.connect.runtime.rest.resources.LoggingResource will be ignored.
Nov 29, 2021 5:40:15 PM org.glassfish.jersey.internal.Errors logErrors
WARNING: The following warnings have been detected: WARNING: The (sub)resource method listLoggers in org.apache.kafka.connect.runtime.rest.resources.LoggingResource contains empty path annotation.
WARNING: The (sub)resource method createConnector in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectors in org.apache.kafka.connect.runtime.rest.resources.ConnectorsResource contains empty path annotation.
WARNING: The (sub)resource method listConnectorPlugins in org.apache.kafka.connect.runtime.rest.resources.ConnectorPluginsResource contains empty path annotation.
WARNING: The (sub)resource method serverInfo in org.apache.kafka.connect.runtime.rest.resources.RootResource contains empty path annotation.
[2021-11-29 17:40:15,294] INFO Started o.e.j.s.ServletContextHandler@c0004b7{/,null,AVAILABLE} (org.eclipse.jetty.server.handler.ContextHandler:915)
[2021-11-29 17:40:15,294] INFO REST resources initialized; server is started and ready to handle requests (org.apache.kafka.connect.runtime.rest.RestServer:303)
[2021-11-29 17:40:15,294] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:57)
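The "Connector hudi-sink config updated" entries that follow are triggered by submitting the connector configuration to the worker's REST endpoint (listening on 127.0.0.1:8083 per the assignment above). A minimal sketch of that registration payload — field values mirror the SinkConnectorConfig dump logged below; the exact JSON file name is illustrative:

```python
import json

# Connector registration payload; the keys and values mirror the
# SinkConnectorConfig dump the worker logs once the config is applied.
connector_config = {
    "name": "hudi-sink",
    "config": {
        "connector.class": "org.apache.hudi.connect.HoodieSinkConnector",
        "tasks.max": "4",
        "topics": "hudi-test-topic",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.storage.StringConverter",
    },
}

# POSTing this to the worker's REST API creates the connector, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @hudi-sink.json http://127.0.0.1:8083/connectors
payload = json.dumps(connector_config)
```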
[2021-11-29 17:40:18,048] INFO AbstractConfig values:
(org.apache.kafka.common.config.AbstractConfig:376)
[2021-11-29 17:40:18,060] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Connector hudi-sink config updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1672)
[2021-11-29 17:40:18,062] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222)
[2021-11-29 17:40:18,062] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] (Re-)joining group (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:535)
[2021-11-29 17:40:18,065] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Successfully joined group with generation Generation{generationId=8, memberId='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:591)
[2021-11-29 17:40:18,071] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Successfully synced group in generation Generation{generationId=8, memberId='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:757)
[2021-11-29 17:40:18,071] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Joined group at generation 8 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', leaderUrl='http://127.0.0.1:8083/', offset=11, connectorIds=[hudi-sink], taskIds=[], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1848)
[2021-11-29 17:40:18,072] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting connectors and tasks using config offset 11 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1378)
[2021-11-29 17:40:18,073] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting connector hudi-sink (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1455)
[2021-11-29 17:40:18,079] INFO [hudi-sink|worker] Creating connector hudi-sink of type org.apache.hudi.connect.HoodieSinkConnector (org.apache.kafka.connect.runtime.Worker:268)
[2021-11-29 17:40:18,079] INFO [hudi-sink|worker] SinkConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2021-11-29 17:40:18,080] INFO [hudi-sink|worker] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,087] INFO [hudi-sink|worker] Instantiated connector hudi-sink with version 0.1.0 of type class org.apache.hudi.connect.HoodieSinkConnector (org.apache.kafka.connect.runtime.Worker:278)
[2021-11-29 17:40:18,087] INFO [hudi-sink|worker] Finished creating connector hudi-sink (org.apache.kafka.connect.runtime.Worker:304)
[2021-11-29 17:40:18,089] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1406)
[2021-11-29 17:40:18,102] INFO SinkConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2021-11-29 17:40:18,103] INFO EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,130] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Tasks [hudi-sink-0, hudi-sink-1, hudi-sink-2, hudi-sink-3] configs updated (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1687)
[2021-11-29 17:40:18,132] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Handling task config update by restarting tasks [] (org.apache.kafka.connect.runtime.distributed.DistributedHerder:687)
[2021-11-29 17:40:18,132] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Rebalance started (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:222)
[2021-11-29 17:40:18,133] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] (Re-)joining group (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:535)
[2021-11-29 17:40:18,135] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Successfully joined group with generation Generation{generationId=9, memberId='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:591)
[2021-11-29 17:40:18,140] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Successfully synced group in generation Generation{generationId=9, memberId='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', protocol='sessioned'} (org.apache.kafka.connect.runtime.distributed.WorkerCoordinator:757)
[2021-11-29 17:40:18,140] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Joined group at generation 9 with protocol version 2 and got assignment: Assignment{error=0, leader='connect-1-58f0f3b0-d87f-4ddf-bee5-5312b2dad26e', leaderUrl='http://127.0.0.1:8083/', offset=16, connectorIds=[hudi-sink], taskIds=[hudi-sink-0, hudi-sink-1, hudi-sink-2, hudi-sink-3], revokedConnectorIds=[], revokedTaskIds=[], delay=0} with rebalance delay: 0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1848)
[2021-11-29 17:40:18,142] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting connectors and tasks using config offset 16 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1378)
[2021-11-29 17:40:18,143] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting task hudi-sink-0 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1420)
[2021-11-29 17:40:18,144] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting task hudi-sink-2 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1420)
[2021-11-29 17:40:18,143] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting task hudi-sink-1 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1420)
[2021-11-29 17:40:18,144] INFO [hudi-sink|task-2] Creating task hudi-sink-2 (org.apache.kafka.connect.runtime.Worker:503)
[2021-11-29 17:40:18,144] INFO [hudi-sink|task-0] Creating task hudi-sink-0 (org.apache.kafka.connect.runtime.Worker:503)
[2021-11-29 17:40:18,144] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Starting task hudi-sink-3 (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1420)
[2021-11-29 17:40:18,144] INFO [hudi-sink|task-1] Creating task hudi-sink-1 (org.apache.kafka.connect.runtime.Worker:503)
[2021-11-29 17:40:18,145] INFO [hudi-sink|task-3] Creating task hudi-sink-3 (org.apache.kafka.connect.runtime.Worker:503)
[2021-11-29 17:40:18,147] INFO [hudi-sink|task-1] ConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig:376)
[2021-11-29 17:40:18,147] INFO [hudi-sink|task-3] ConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig:376)
[2021-11-29 17:40:18,147] INFO [hudi-sink|task-0] ConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig:376)
[2021-11-29 17:40:18,147] INFO [hudi-sink|task-2] ConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig:376)
[2021-11-29 17:40:18,148] INFO [hudi-sink|task-0] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,148] INFO [hudi-sink|task-3] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,147] INFO [hudi-sink|task-1] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,149] INFO [hudi-sink|task-2] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,155] INFO [hudi-sink|task-1] TaskConfig values:
task.class = class org.apache.hudi.connect.HoodieSinkTask
(org.apache.kafka.connect.runtime.TaskConfig:376)
[2021-11-29 17:40:18,156] INFO [hudi-sink|task-0] TaskConfig values:
task.class = class org.apache.hudi.connect.HoodieSinkTask
(org.apache.kafka.connect.runtime.TaskConfig:376)
[2021-11-29 17:40:18,155] INFO [hudi-sink|task-3] TaskConfig values:
task.class = class org.apache.hudi.connect.HoodieSinkTask
(org.apache.kafka.connect.runtime.TaskConfig:376)
[2021-11-29 17:40:18,156] INFO [hudi-sink|task-2] TaskConfig values:
task.class = class org.apache.hudi.connect.HoodieSinkTask
(org.apache.kafka.connect.runtime.TaskConfig:376)
[2021-11-29 17:40:18,157] INFO [hudi-sink|task-3] Instantiated task hudi-sink-3 with version 0.1.0 of type org.apache.hudi.connect.HoodieSinkTask (org.apache.kafka.connect.runtime.Worker:518)
[2021-11-29 17:40:18,157] INFO [hudi-sink|task-0] Instantiated task hudi-sink-0 with version 0.1.0 of type org.apache.hudi.connect.HoodieSinkTask (org.apache.kafka.connect.runtime.Worker:518)
[2021-11-29 17:40:18,157] INFO [hudi-sink|task-1] Instantiated task hudi-sink-1 with version 0.1.0 of type org.apache.hudi.connect.HoodieSinkTask (org.apache.kafka.connect.runtime.Worker:518)
[2021-11-29 17:40:18,160] INFO [hudi-sink|task-2] Instantiated task hudi-sink-2 with version 0.1.0 of type org.apache.hudi.connect.HoodieSinkTask (org.apache.kafka.connect.runtime.Worker:518)
[2021-11-29 17:40:18,162] INFO [hudi-sink|task-0] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = key
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,163] INFO [hudi-sink|task-1] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = key
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,162] INFO [hudi-sink|task-3] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = key
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,162] INFO [hudi-sink|task-2] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = key
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,164] INFO [hudi-sink|task-3] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = value
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,164] INFO [hudi-sink|task-1] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = value
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,164] INFO [hudi-sink|task-0] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = value
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,165] INFO [hudi-sink|task-1] Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-1 using the connector config (org.apache.kafka.connect.runtime.Worker:533)
[2021-11-29 17:40:18,165] INFO [hudi-sink|task-3] Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-3 using the connector config (org.apache.kafka.connect.runtime.Worker:533)
[2021-11-29 17:40:18,164] INFO [hudi-sink|task-2] StringConverterConfig values:
converter.encoding = UTF-8
converter.type = value
(org.apache.kafka.connect.storage.StringConverterConfig:376)
[2021-11-29 17:40:18,166] INFO [hudi-sink|task-3] Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-3 using the connector config (org.apache.kafka.connect.runtime.Worker:539)
[2021-11-29 17:40:18,166] INFO [hudi-sink|task-1] Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-1 using the connector config (org.apache.kafka.connect.runtime.Worker:539)
[2021-11-29 17:40:18,165] INFO [hudi-sink|task-0] Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-0 using the connector config (org.apache.kafka.connect.runtime.Worker:533)
[2021-11-29 17:40:18,166] INFO [hudi-sink|task-2] Set up the key converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-2 using the connector config (org.apache.kafka.connect.runtime.Worker:533)
[2021-11-29 17:40:18,167] INFO [hudi-sink|task-1] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task hudi-sink-1 using the worker config (org.apache.kafka.connect.runtime.Worker:544)
[2021-11-29 17:40:18,167] INFO [hudi-sink|task-3] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task hudi-sink-3 using the worker config (org.apache.kafka.connect.runtime.Worker:544)
[2021-11-29 17:40:18,167] INFO [hudi-sink|task-0] Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-0 using the connector config (org.apache.kafka.connect.runtime.Worker:539)
[2021-11-29 17:40:18,167] INFO [hudi-sink|task-2] Set up the value converter class org.apache.kafka.connect.storage.StringConverter for task hudi-sink-2 using the connector config (org.apache.kafka.connect.runtime.Worker:539)
[2021-11-29 17:40:18,168] INFO [hudi-sink|task-0] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task hudi-sink-0 using the worker config (org.apache.kafka.connect.runtime.Worker:544)
[2021-11-29 17:40:18,170] INFO [hudi-sink|task-2] Set up the header converter class org.apache.kafka.connect.storage.SimpleHeaderConverter for task hudi-sink-2 using the worker config (org.apache.kafka.connect.runtime.Worker:544)
[2021-11-29 17:40:18,182] INFO [hudi-sink|task-3] Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:626)
[2021-11-29 17:40:18,183] INFO [hudi-sink|task-1] Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:626)
[2021-11-29 17:40:18,182] INFO [hudi-sink|task-0] Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:626)
[2021-11-29 17:40:18,182] INFO [hudi-sink|task-2] Initializing: org.apache.kafka.connect.runtime.TransformationChain{} (org.apache.kafka.connect.runtime.Worker:626)
[2021-11-29 17:40:18,184] INFO [hudi-sink|task-3] SinkConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2021-11-29 17:40:18,184] INFO [hudi-sink|task-1] SinkConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2021-11-29 17:40:18,184] INFO [hudi-sink|task-2] SinkConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2021-11-29 17:40:18,184] INFO [hudi-sink|task-0] SinkConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.SinkConnectorConfig:376)
[2021-11-29 17:40:18,185] INFO [hudi-sink|task-2] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,184] INFO [hudi-sink|task-3] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,185] INFO [hudi-sink|task-0] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,185] INFO [hudi-sink|task-1] EnrichedConnectorConfig values:
config.action.reload = restart
connector.class = org.apache.hudi.connect.HoodieSinkConnector
errors.deadletterqueue.context.headers.enable = false
errors.deadletterqueue.topic.name =
errors.deadletterqueue.topic.replication.factor = 3
errors.log.enable = false
errors.log.include.messages = false
errors.retry.delay.max.ms = 60000
errors.retry.timeout = 0
errors.tolerance = none
header.converter = null
key.converter = class org.apache.kafka.connect.storage.StringConverter
name = hudi-sink
predicates = []
tasks.max = 4
topics = [hudi-test-topic]
topics.regex =
transforms = []
value.converter = class org.apache.kafka.connect.storage.StringConverter
(org.apache.kafka.connect.runtime.ConnectorConfig$EnrichedConnectorConfig:376)
[2021-11-29 17:40:18,191] INFO [hudi-sink|task-0] ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [kafkabroker:9092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = connector-consumer-hudi-sink-0
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = connect-hudi-sink
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:18,193] INFO [hudi-sink|task-2] ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [kafkabroker:9092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = connector-consumer-hudi-sink-2
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = connect-hudi-sink
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:18,194] INFO [hudi-sink|task-1] ConsumerConfig values:
	[... identical to the task-2 ConsumerConfig dump above, except client.id = connector-consumer-hudi-sink-1 ...]
 (org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:18,197] INFO [hudi-sink|task-3] ConsumerConfig values:
	[... identical to the task-2 ConsumerConfig dump above, except client.id = connector-consumer-hudi-sink-3 ...]
 (org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:18,206] WARN [hudi-sink|task-0] The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,207] WARN [hudi-sink|task-0] The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,207] WARN [hudi-sink|task-1] The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,207] INFO [hudi-sink|task-0] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:18,207] WARN [hudi-sink|task-1] The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,207] INFO [hudi-sink|task-0] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:18,207] INFO [hudi-sink|task-0] Kafka startTimeMs: 1638236418207 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:18,208] WARN [hudi-sink|task-3] The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,208] WARN [hudi-sink|task-3] The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,208] INFO [hudi-sink|task-1] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:18,208] WARN [hudi-sink|task-2] The configuration 'metrics.context.connect.kafka.cluster.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,208] WARN [hudi-sink|task-2] The configuration 'metrics.context.connect.group.id' was supplied but isn't a known config. (org.apache.kafka.clients.consumer.ConsumerConfig:384)
[2021-11-29 17:40:18,208] INFO [hudi-sink|task-1] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:18,208] INFO [hudi-sink|task-1] Kafka startTimeMs: 1638236418207 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:18,209] INFO [hudi-sink|task-2] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:18,209] INFO [hudi-sink|task-2] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:18,209] INFO [hudi-sink|task-2] Kafka startTimeMs: 1638236418208 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:18,209] INFO [hudi-sink|task-3] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:18,209] INFO [hudi-sink|task-3] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:18,209] INFO [hudi-sink|task-3] Kafka startTimeMs: 1638236418208 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:18,227] INFO [Worker clientId=connect-1, groupId=hudi-connect-cluster] Finished starting connectors and tasks (org.apache.kafka.connect.runtime.distributed.DistributedHerder:1406)
[2021-11-29 17:40:18,232] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Subscribed to topic(s): hudi-test-topic (org.apache.kafka.clients.consumer.KafkaConsumer:966)
[2021-11-29 17:40:18,232] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Subscribed to topic(s): hudi-test-topic (org.apache.kafka.clients.consumer.KafkaConsumer:966)
[2021-11-29 17:40:18,232] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Subscribed to topic(s): hudi-test-topic (org.apache.kafka.clients.consumer.KafkaConsumer:966)
[2021-11-29 17:40:18,232] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Subscribed to topic(s): hudi-test-topic (org.apache.kafka.clients.consumer.KafkaConsumer:966)
[2021-11-29 17:40:18,267] INFO [hudi-sink|task-3] ProducerConfig values:
acks = -1
batch.size = 16384
bootstrap.servers = [kafkabroker:9092]
buffer.memory = 33554432
client.dns.lookup = use_all_dns_ips
client.id = producer-4
compression.type = none
connections.max.idle.ms = 540000
delivery.timeout.ms = 120000
enable.idempotence = true
interceptor.classes = []
key.serializer = class org.apache.kafka.common.serialization.StringSerializer
linger.ms = 0
max.block.ms = 60000
max.in.flight.requests.per.connection = 5
max.request.size = 1048576
metadata.max.age.ms = 300000
metadata.max.idle.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
receive.buffer.bytes = 32768
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer
(org.apache.kafka.clients.producer.ProducerConfig:376)
[2021-11-29 17:40:18,273] INFO [hudi-sink|task-3] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:18,274] INFO [hudi-sink|task-3] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:18,275] INFO [hudi-sink|task-3] Kafka startTimeMs: 1638236418273 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:18,278] INFO [hudi-sink|task-3] ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = latest
bootstrap.servers = [kafkabroker:9092]
check.crcs = true
client.dns.lookup = use_all_dns_ips
client.id = consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = true
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
internal.throw.on.fetch.stable.offset.unsupported = false
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor]
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 45000
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
(org.apache.kafka.clients.consumer.ConsumerConfig:376)
[2021-11-29 17:40:18,279] INFO [hudi-sink|task-3] [Producer clientId=producer-4] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:18,288] INFO [hudi-sink|task-3] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:18,288] INFO [hudi-sink|task-3] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:18,289] INFO [hudi-sink|task-3] Kafka startTimeMs: 1638236418288 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:40:18,289] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Subscribed to topic(s): hudi-control-topic (org.apache.kafka.clients.consumer.KafkaConsumer:966)
[2021-11-29 17:40:18,290] INFO [hudi-sink|task-3] WorkerSinkTask{id=hudi-sink-3} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:309)
[2021-11-29 17:40:18,290] INFO [hudi-sink|task-1] WorkerSinkTask{id=hudi-sink-1} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:309)
[2021-11-29 17:40:18,290] INFO [hudi-sink|task-0] WorkerSinkTask{id=hudi-sink-0} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:309)
[2021-11-29 17:40:18,290] INFO [hudi-sink|task-2] WorkerSinkTask{id=hudi-sink-2} Sink task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSinkTask:309)
[2021-11-29 17:40:18,296] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:18,296] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:18,297] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Discovered group coordinator localhost:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:849)
[2021-11-29 17:40:18,297] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:18,297] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:18,297] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Cluster ID: O-MiC5q5R8awBOunyQdBWA (org.apache.kafka.clients.Metadata:287)
[2021-11-29 17:40:18,297] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Discovered group coordinator localhost:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:849)
[2021-11-29 17:40:18,297] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Discovered group coordinator localhost:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:849)
[2021-11-29 17:40:18,298] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Discovered group coordinator localhost:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:849)
[2021-11-29 17:40:18,298] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Discovered group coordinator localhost:9092 (id: 2147483647 rack: null) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:849)
[2021-11-29 17:40:18,299] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,298] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,300] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,298] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,299] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,315] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Request joining group due to: need to re-join with the given member-id (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:988)
[2021-11-29 17:40:18,315] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Request joining group due to: need to re-join with the given member-id (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:988)
[2021-11-29 17:40:18,315] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Request joining group due to: need to re-join with the given member-id (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:988)
[2021-11-29 17:40:18,316] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,315] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Request joining group due to: need to re-join with the given member-id (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:988)
[2021-11-29 17:40:18,315] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,315] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Request joining group due to: need to re-join with the given member-id (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:988)
[2021-11-29 17:40:18,316] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,316] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,316] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,319] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Successfully joined group with generation Generation{generationId=3, memberId='connector-consumer-hudi-sink-0-9a8b1171-540d-45f5-b81d-94ff84d109b8', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:591)
[2021-11-29 17:40:18,320] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Successfully joined group with generation Generation{generationId=1, memberId='consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4-0d3300a8-c406-4b36-8bd6-7ac7c01d32ad', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:591)
[2021-11-29 17:40:18,322] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Finished assignment for group at generation 1: {consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4-0d3300a8-c406-4b36-8bd6-7ac7c01d32ad=Assignment(partitions=[hudi-control-topic-0])} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:626)
[2021-11-29 17:40:18,322] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Finished assignment for group at generation 3: {connector-consumer-hudi-sink-0-9a8b1171-540d-45f5-b81d-94ff84d109b8=Assignment(partitions=[hudi-test-topic-0, hudi-test-topic-1, hudi-test-topic-2, hudi-test-topic-3])} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:626)
[2021-11-29 17:40:18,329] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] SyncGroup failed: The group began another rebalance. Need to re-join the group. Sent generation was Generation{generationId=3, memberId='connector-consumer-hudi-sink-0-9a8b1171-540d-45f5-b81d-94ff84d109b8', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:780)
[2021-11-29 17:40:18,329] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] (Re-)joining group (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:535)
[2021-11-29 17:40:18,329] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Successfully synced group in generation Generation{generationId=1, memberId='consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4-0d3300a8-c406-4b36-8bd6-7ac7c01d32ad', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:757)
[2021-11-29 17:40:18,330] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Notifying assignor about the new Assignment(partitions=[hudi-control-topic-0]) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:276)
[2021-11-29 17:40:18,330] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Adding newly assigned partitions: hudi-control-topic-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:288)
[2021-11-29 17:40:18,331] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Successfully joined group with generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-3-226c1313-5b5c-41a0-b458-bb012bf3ff49', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:591)
[2021-11-29 17:40:18,331] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Successfully joined group with generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-0-9a8b1171-540d-45f5-b81d-94ff84d109b8', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:591)
[2021-11-29 17:40:18,331] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Successfully joined group with generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-2-0e9aeb3b-c45a-43ee-84c6-d0b9ab6e1b1c', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:591)
[2021-11-29 17:40:18,331] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Successfully joined group with generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-1-0303f605-44e4-49c5-9708-27932c102c3e', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:591)
[2021-11-29 17:40:18,333] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Finished assignment for group at generation 4: {connector-consumer-hudi-sink-3-226c1313-5b5c-41a0-b458-bb012bf3ff49=Assignment(partitions=[hudi-test-topic-3]), connector-consumer-hudi-sink-0-9a8b1171-540d-45f5-b81d-94ff84d109b8=Assignment(partitions=[hudi-test-topic-0]), connector-consumer-hudi-sink-1-0303f605-44e4-49c5-9708-27932c102c3e=Assignment(partitions=[hudi-test-topic-1]), connector-consumer-hudi-sink-2-0e9aeb3b-c45a-43ee-84c6-d0b9ab6e1b1c=Assignment(partitions=[hudi-test-topic-2])} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:626)
[2021-11-29 17:40:18,339] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Successfully synced group in generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-3-226c1313-5b5c-41a0-b458-bb012bf3ff49', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:757)
[2021-11-29 17:40:18,339] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Successfully synced group in generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-1-0303f605-44e4-49c5-9708-27932c102c3e', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:757)
[2021-11-29 17:40:18,339] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Successfully synced group in generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-0-9a8b1171-540d-45f5-b81d-94ff84d109b8', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:757)
[2021-11-29 17:40:18,339] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Successfully synced group in generation Generation{generationId=4, memberId='connector-consumer-hudi-sink-2-0e9aeb3b-c45a-43ee-84c6-d0b9ab6e1b1c', protocol='range'} (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:757)
[2021-11-29 17:40:18,340] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Notifying assignor about the new Assignment(partitions=[hudi-test-topic-0]) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:276)
[2021-11-29 17:40:18,340] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Notifying assignor about the new Assignment(partitions=[hudi-test-topic-1]) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:276)
[2021-11-29 17:40:18,339] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Notifying assignor about the new Assignment(partitions=[hudi-test-topic-3]) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:276)
[2021-11-29 17:40:18,340] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Found no committed offset for partition hudi-control-topic-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1362)
[2021-11-29 17:40:18,340] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Adding newly assigned partitions: hudi-test-topic-1 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:288)
[2021-11-29 17:40:18,340] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Adding newly assigned partitions: hudi-test-topic-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:288)
[2021-11-29 17:40:18,340] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Notifying assignor about the new Assignment(partitions=[hudi-test-topic-2]) (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:276)
[2021-11-29 17:40:18,340] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Adding newly assigned partitions: hudi-test-topic-3 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:288)
[2021-11-29 17:40:18,341] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Adding newly assigned partitions: hudi-test-topic-2 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:288)
[2021-11-29 17:40:18,342] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Found no committed offset for partition hudi-test-topic-2 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1362)
[2021-11-29 17:40:18,342] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Found no committed offset for partition hudi-test-topic-1 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1362)
[2021-11-29 17:40:18,342] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Found no committed offset for partition hudi-test-topic-0 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1362)
[2021-11-29 17:40:18,342] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Found no committed offset for partition hudi-test-topic-3 (org.apache.kafka.clients.consumer.internals.ConsumerCoordinator:1362)
[2021-11-29 17:40:18,346] INFO [hudi-sink|task-3] [Consumer clientId=consumer-hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417-4, groupId=hudi-control-group350e64d2-44e8-4193-8c93-af93d1c94417] Resetting offset for partition hudi-control-topic-0 to position FetchPosition{offset=1, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:18,347] INFO [hudi-sink|task-1] [Consumer clientId=connector-consumer-hudi-sink-1, groupId=connect-hudi-sink] Resetting offset for partition hudi-test-topic-1 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:18,348] INFO [hudi-sink|task-0] [Consumer clientId=connector-consumer-hudi-sink-0, groupId=connect-hudi-sink] Resetting offset for partition hudi-test-topic-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:18,347] INFO [hudi-sink|task-2] [Consumer clientId=connector-consumer-hudi-sink-2, groupId=connect-hudi-sink] Resetting offset for partition hudi-test-topic-2 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:18,348] INFO [hudi-sink|task-3] [Consumer clientId=connector-consumer-hudi-sink-3, groupId=connect-hudi-sink] Resetting offset for partition hudi-test-topic-3 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:9092 (id: 0 rack: null)], epoch=0}}. (org.apache.kafka.clients.consumer.internals.SubscriptionState:398)
[2021-11-29 17:40:18,499] INFO [hudi-sink|task-0] The value of hoodie.datasource.write.keygenerator.type is empty, using SIMPLE (org.apache.hudi.keygen.factory.HoodieAvroKeyGeneratorFactory:63)
[2021-11-29 17:40:18,499] INFO [hudi-sink|task-2] The value of hoodie.datasource.write.keygenerator.type is empty, using SIMPLE (org.apache.hudi.keygen.factory.HoodieAvroKeyGeneratorFactory:63)
[2021-11-29 17:40:18,499] INFO [hudi-sink|task-1] The value of hoodie.datasource.write.keygenerator.type is empty, using SIMPLE (org.apache.hudi.keygen.factory.HoodieAvroKeyGeneratorFactory:63)
[2021-11-29 17:40:18,499] INFO [hudi-sink|task-3] The value of hoodie.datasource.write.keygenerator.type is empty, using SIMPLE (org.apache.hudi.keygen.factory.HoodieAvroKeyGeneratorFactory:63)
[2021-11-29 17:40:18,646] WARN [hudi-sink|task-0] Unable to load native-hadoop library for your platform... using builtin-java classes where applicable (org.apache.hadoop.util.NativeCodeLoader:62)
[2021-11-29 17:40:19,046] INFO [hudi-sink|task-2] Logging initialized @12521ms to org.apache.hudi.org.eclipse.jetty.util.log.Slf4jLog (org.apache.hudi.org.eclipse.jetty.util.log:193)
[2021-11-29 17:40:19,191] INFO [hudi-sink|task-2]
__ __ _
/ /____ _ _ __ ____ _ / /(_)____
__ / // __ `/| | / // __ `// // // __ \
/ /_/ // /_/ / | |/ // /_/ // // // / / /
\____/ \__,_/ |___/ \__,_//_//_//_/ /_/
https://javalin.io/documentation
(io.javalin.Javalin:134)
[2021-11-29 17:40:19,222] INFO [hudi-sink|task-2] Starting Javalin ... (io.javalin.Javalin:139)
[2021-11-29 17:40:19,341] INFO [hudi-sink|task-2] Listening on http://localhost:63485/ (io.javalin.Javalin:113)
[2021-11-29 17:40:19,341] INFO [hudi-sink|task-2] Javalin started in 153ms \o/ (io.javalin.Javalin:149)
[2021-11-29 17:40:19,347] INFO [hudi-sink|task-0]
__ __ _
/ /____ _ _ __ ____ _ / /(_)____
__ / // __ `/| | / // __ `// // // __ \
/ /_/ // /_/ / | |/ // /_/ // // // / / /
\____/ \__,_/ |___/ \__,_//_//_//_/ /_/
https://javalin.io/documentation
(io.javalin.Javalin:134)
[2021-11-29 17:40:19,347] INFO [hudi-sink|task-0] Starting Javalin ... (io.javalin.Javalin:139)
[2021-11-29 17:40:19,353] INFO [hudi-sink|task-0] Listening on http://localhost:63487/ (io.javalin.Javalin:113)
[2021-11-29 17:40:19,353] INFO [hudi-sink|task-0] Javalin started in 6ms \o/ (io.javalin.Javalin:149)
[2021-11-29 17:40:19,359] INFO [hudi-sink|task-1]
__ __ _
/ /____ _ _ __ ____ _ / /(_)____
__ / // __ `/| | / // __ `// // // __ \
/ /_/ // /_/ / | |/ // /_/ // // // / / /
\____/ \__,_/ |___/ \__,_//_//_//_/ /_/
https://javalin.io/documentation
(io.javalin.Javalin:134)
[2021-11-29 17:40:19,359] INFO [hudi-sink|task-1] Starting Javalin ... (io.javalin.Javalin:139)
[2021-11-29 17:40:19,365] INFO [hudi-sink|task-1] Listening on http://localhost:63489/ (io.javalin.Javalin:113)
[2021-11-29 17:40:19,365] INFO [hudi-sink|task-1] Javalin started in 6ms \o/ (io.javalin.Javalin:149)
[2021-11-29 17:40:19,371] INFO [hudi-sink|task-3]
__ __ _
/ /____ _ _ __ ____ _ / /(_)____
__ / // __ `/| | / // __ `// // // __ \
/ /_/ // /_/ / | |/ // /_/ // // // / / /
\____/ \__,_/ |___/ \__,_//_//_//_/ /_/
https://javalin.io/documentation
(io.javalin.Javalin:134)
[2021-11-29 17:40:19,371] INFO [hudi-sink|task-3] Starting Javalin ... (io.javalin.Javalin:139)
[2021-11-29 17:40:19,378] INFO [hudi-sink|task-3] Listening on http://localhost:63491/ (io.javalin.Javalin:113)
[2021-11-29 17:40:19,378] INFO [hudi-sink|task-3] Javalin started in 7ms \o/ (io.javalin.Javalin:149)
[2021-11-29 17:40:19,391] INFO [hudi-sink|task-0] The value of hoodie.datasource.write.keygenerator.type is empty, using SIMPLE (org.apache.hudi.keygen.factory.HoodieAvroKeyGeneratorFactory:63)
[2021-11-29 17:40:19,408] INFO [hudi-sink|task-0]
__ __ _
/ /____ _ _ __ ____ _ / /(_)____
__ / // __ `/| | / // __ `// // // __ \
/ /_/ // /_/ / | |/ // /_/ // // // / / /
\____/ \__,_/ |___/ \__,_//_//_//_/ /_/
https://javalin.io/documentation
(io.javalin.Javalin:134)
[2021-11-29 17:40:19,408] INFO [hudi-sink|task-0] Starting Javalin ... (io.javalin.Javalin:139)
[2021-11-29 17:40:19,413] INFO [hudi-sink|task-0] Listening on http://localhost:63493/ (io.javalin.Javalin:113)
[2021-11-29 17:40:19,414] INFO [hudi-sink|task-0] Javalin started in 7ms \o/ (io.javalin.Javalin:149)
[2021-11-29 17:40:19,476] INFO [hudi-sink|task-0] AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:40:19,477] INFO [hudi-sink|task-0] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:40:19,477] INFO [hudi-sink|task-0] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:40:19,477] INFO [hudi-sink|task-0] Kafka startTimeMs: 1638236419477 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:42:17,715] INFO [hudi-sink|task-0] AdminClientConfig values:
bootstrap.servers = [kafkabroker:9092]
client.dns.lookup = use_all_dns_ips
client.id =
connections.max.idle.ms = 300000
default.api.timeout.ms = 60000
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
receive.buffer.bytes = 65536
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retries = 2147483647
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
socket.connection.setup.timeout.max.ms = 30000
socket.connection.setup.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2]
ssl.endpoint.identification.algorithm = https
ssl.engine.factory.class = null
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.certificate.chain = null
ssl.keystore.key = null
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLSv1.2
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.certificates = null
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
(org.apache.kafka.clients.admin.AdminClientConfig:376)
[2021-11-29 17:42:17,721] INFO [hudi-sink|task-0] Kafka version: 3.0.0 (org.apache.kafka.common.utils.AppInfoParser:119)
[2021-11-29 17:42:17,722] INFO [hudi-sink|task-0] Kafka commitId: unknown (org.apache.kafka.common.utils.AppInfoParser:120)
[2021-11-29 17:42:17,722] INFO [hudi-sink|task-0] Kafka startTimeMs: 1638236537721 (org.apache.kafka.common.utils.AppInfoParser:121)
[2021-11-29 17:43:19,763] INFO [hudi-sink|task-3] Found checksum error: b[0, 91]=706172746974696f6e5f342f31463946373736444442363643353038373631333144443446443036303332365f302d302d305f32303231313132393137343031393530362e706172717565742e6d61726b65722e415050454e440a (org.apache.hadoop.fs.FSInputChecker:309)
org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:347)
at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:303)
at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:252)
at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:197)
at java.io.DataInputStream.read(DataInputStream.java:149)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at java.io.BufferedReader$1.hasNext(BufferedReader.java:571)
at java.util.Iterator.forEachRemaining(Iterator.java:115)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
at org.apache.hudi.common.util.FileIOUtils.readAsUTFStringLines(FileIOUtils.java:88)
at org.apache.hudi.common.util.MarkerUtils.readMarkersFromFile(MarkerUtils.java:208)
at org.apache.hudi.common.util.MarkerUtils.lambda$readTimelineServerBasedMarkersFromFileSystem$141c8e72$1(MarkerUtils.java:185)
at org.apache.hudi.common.fs.FSUtils.lambda$parallelizeFilesProcess$1f9929d5$1(FSUtils.java:700)
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:68)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
at org.apache.hudi.client.common.HoodieJavaEngineContext.mapToPair(HoodieJavaEngineContext.java:130)
at org.apache.hudi.common.fs.FSUtils.parallelizeFilesProcess(FSUtils.java:699)
at org.apache.hudi.common.fs.FSUtils.parallelizeSubPathProcess(FSUtils.java:682)
at org.apache.hudi.common.util.MarkerUtils.readTimelineServerBasedMarkersFromFileSystem(MarkerUtils.java:180)
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.syncMarkersFromFileSystem(MarkerDirState.java:248)
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.<init>(MarkerDirState.java:100)
at org.apache.hudi.timeline.service.handlers.MarkerHandler.getMarkerDirState(MarkerHandler.java:188)
at org.apache.hudi.timeline.service.handlers.MarkerHandler.createMarker(MarkerHandler.java:156)
at org.apache.hudi.timeline.service.RequestHandler.lambda$registerMarkerAPI$27(RequestHandler.java:448)
at org.apache.hudi.timeline.service.RequestHandler$ViewHandler.handle(RequestHandler.java:497)
at io.javalin.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:22)
at io.javalin.Javalin.lambda$addHandler$0(Javalin.java:606)
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:46)
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:17)
at io.javalin.core.JavalinServlet$service$1.invoke(JavalinServlet.kt:143)
at io.javalin.core.JavalinServlet$service$2.invoke(JavalinServlet.kt:41)
at io.javalin.core.JavalinServlet.service(JavalinServlet.kt:107)
at io.javalin.core.util.JettyServerUtil$initialize$httpHandler$1.doHandle(JettyServerUtil.kt:72)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
at org.apache.hudi.org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
at org.apache.hudi.org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1668)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
at org.apache.hudi.org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61)
at org.apache.hudi.org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174)
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.apache.hudi.org.eclipse.jetty.server.Server.handle(Server.java:502)
at org.apache.hudi.org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
at org.apache.hudi.org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
at org.apache.hudi.org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
at org.apache.hudi.org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.apache.hudi.org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
at org.apache.hudi.org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
at java.lang.Thread.run(Thread.java:748)
[2021-11-29 17:43:19,792] INFO [hudi-sink|task-0] Found checksum error: b[0, 91]=706172746974696f6e5f342f31463946373736444442363643353038373631333144443446443036303332365f302d302d305f32303231313132393137343031393530362e706172717565742e6d61726b65722e415050454e440a (org.apache.hadoop.fs.FSInputChecker:309)
org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
	(stack trace identical to the one above)
17:43:19.794 [qtp298869091-146] ERROR org.apache.hudi.timeline.service.RequestHandler - Got runtime exception servicing request markername=partition_1%2F5ACAD57E1A83AF31F389CDA0BC3D595C_0-0-0_20211129174019506.parquet.marker.APPEND&markerdirpath=file%3A%2Ftmp%2Fhoodie%2Fhudi-test-topic%2F.hoodie%2F.temp%2F20211129174217738
org.apache.hudi.exception.HoodieException: Error occurs when executing mapToPair
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:70) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_265]
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_265]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_265]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_265]
at org.apache.hudi.client.common.HoodieJavaEngineContext.mapToPair(HoodieJavaEngineContext.java:130) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.parallelizeFilesProcess(FSUtils.java:699) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.parallelizeSubPathProcess(FSUtils.java:682) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.readTimelineServerBasedMarkersFromFileSystem(MarkerUtils.java:180) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.syncMarkersFromFileSystem(MarkerDirState.java:248) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.<init>(MarkerDirState.java:100) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.MarkerHandler.getMarkerDirState(MarkerHandler.java:188) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.MarkerHandler.createMarker(MarkerHandler.java:156) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.RequestHandler.lambda$registerMarkerAPI$27(RequestHandler.java:448) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.RequestHandler$ViewHandler.handle(RequestHandler.java:497) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:22) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.Javalin.lambda$addHandler$0(Javalin.java:606) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:46) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:17) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$1.invoke(JavalinServlet.kt:143) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$2.invoke(JavalinServlet.kt:41) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet.service(JavalinServlet.kt:107) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.util.JettyServerUtil$initialize$httpHandler$1.doHandle(JettyServerUtil.kt:72) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1668) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.Server.handle(Server.java:502) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: java.io.UncheckedIOException: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at java.io.BufferedReader$1.hasNext(BufferedReader.java:574) ~[?:1.8.0_265]
at java.util.Iterator.forEachRemaining(Iterator.java:115) ~[?:1.8.0_265]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_265]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_265]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_265]
at org.apache.hudi.common.util.FileIOUtils.readAsUTFStringLines(FileIOUtils.java:88) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.readMarkersFromFile(MarkerUtils.java:208) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.lambda$readTimelineServerBasedMarkersFromFileSystem$141c8e72$1(MarkerUtils.java:185) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.lambda$parallelizeFilesProcess$1f9929d5$1(FSUtils.java:700) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:68) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 48 more
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:347) ~[hadoop-common-2.10.1.jar:?]
at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:303) ~[hadoop-common-2.10.1.jar:?]
at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:252) ~[hadoop-common-2.10.1.jar:?]
at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:197) ~[hadoop-common-2.10.1.jar:?]
at java.io.DataInputStream.read(DataInputStream.java:149) ~[?:1.8.0_265]
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284) ~[?:1.8.0_265]
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326) ~[?:1.8.0_265]
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178) ~[?:1.8.0_265]
at java.io.InputStreamReader.read(InputStreamReader.java:184) ~[?:1.8.0_265]
at java.io.BufferedReader.fill(BufferedReader.java:161) ~[?:1.8.0_265]
at java.io.BufferedReader.readLine(BufferedReader.java:324) ~[?:1.8.0_265]
at java.io.BufferedReader.readLine(BufferedReader.java:389) ~[?:1.8.0_265]
at java.io.BufferedReader$1.hasNext(BufferedReader.java:571) ~[?:1.8.0_265]
at java.util.Iterator.forEachRemaining(Iterator.java:115) ~[?:1.8.0_265]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_265]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_265]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_265]
at org.apache.hudi.common.util.FileIOUtils.readAsUTFStringLines(FileIOUtils.java:88) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.readMarkersFromFile(MarkerUtils.java:208) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.lambda$readTimelineServerBasedMarkersFromFileSystem$141c8e72$1(MarkerUtils.java:185) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.lambda$parallelizeFilesProcess$1f9929d5$1(FSUtils.java:700) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:68) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 48 more
17:43:19.782 [qtp647066238-134] ERROR org.apache.hudi.timeline.service.RequestHandler - Got runtime exception servicing request markername=partition_3%2FCB086FE9AB27F032E4E97C0644CCAE6B_0-0-0_20211129174019506.parquet.marker.APPEND&markerdirpath=file%3A%2Ftmp%2Fhoodie%2Fhudi-test-topic%2F.hoodie%2F.temp%2F20211129174217738
org.apache.hudi.exception.HoodieException: Error occurs when executing mapToPair
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:70) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[?:1.8.0_265]
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_265]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_265]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_265]
at org.apache.hudi.client.common.HoodieJavaEngineContext.mapToPair(HoodieJavaEngineContext.java:130) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.parallelizeFilesProcess(FSUtils.java:699) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.parallelizeSubPathProcess(FSUtils.java:682) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.readTimelineServerBasedMarkersFromFileSystem(MarkerUtils.java:180) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.syncMarkersFromFileSystem(MarkerDirState.java:248) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.<init>(MarkerDirState.java:100) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.MarkerHandler.getMarkerDirState(MarkerHandler.java:188) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.handlers.MarkerHandler.createMarker(MarkerHandler.java:156) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.RequestHandler.lambda$registerMarkerAPI$27(RequestHandler.java:448) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.timeline.service.RequestHandler$ViewHandler.handle(RequestHandler.java:497) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:22) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.Javalin.lambda$addHandler$0(Javalin.java:606) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:46) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:17) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$1.invoke(JavalinServlet.kt:143) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet$service$2.invoke(JavalinServlet.kt:41) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.JavalinServlet.service(JavalinServlet.kt:107) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at io.javalin.core.util.JettyServerUtil$initialize$httpHandler$1.doHandle(JettyServerUtil.kt:72) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1668) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.Server.handle(Server.java:502) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: java.io.UncheckedIOException: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at java.io.BufferedReader$1.hasNext(BufferedReader.java:574) ~[?:1.8.0_265]
at java.util.Iterator.forEachRemaining(Iterator.java:115) ~[?:1.8.0_265]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_265]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_265]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_265]
at org.apache.hudi.common.util.FileIOUtils.readAsUTFStringLines(FileIOUtils.java:88) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.readMarkersFromFile(MarkerUtils.java:208) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.lambda$readTimelineServerBasedMarkersFromFileSystem$141c8e72$1(MarkerUtils.java:185) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.lambda$parallelizeFilesProcess$1f9929d5$1(FSUtils.java:700) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:68) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 48 more
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:347) ~[hadoop-common-2.10.1.jar:?]
at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:303) ~[hadoop-common-2.10.1.jar:?]
at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:252) ~[hadoop-common-2.10.1.jar:?]
at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:197) ~[hadoop-common-2.10.1.jar:?]
at java.io.DataInputStream.read(DataInputStream.java:149) ~[?:1.8.0_265]
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284) ~[?:1.8.0_265]
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326) ~[?:1.8.0_265]
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178) ~[?:1.8.0_265]
at java.io.InputStreamReader.read(InputStreamReader.java:184) ~[?:1.8.0_265]
at java.io.BufferedReader.fill(BufferedReader.java:161) ~[?:1.8.0_265]
at java.io.BufferedReader.readLine(BufferedReader.java:324) ~[?:1.8.0_265]
at java.io.BufferedReader.readLine(BufferedReader.java:389) ~[?:1.8.0_265]
at java.io.BufferedReader$1.hasNext(BufferedReader.java:571) ~[?:1.8.0_265]
at java.util.Iterator.forEachRemaining(Iterator.java:115) ~[?:1.8.0_265]
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472) ~[?:1.8.0_265]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[?:1.8.0_265]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:1.8.0_265]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566) ~[?:1.8.0_265]
at org.apache.hudi.common.util.FileIOUtils.readAsUTFStringLines(FileIOUtils.java:88) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.readMarkersFromFile(MarkerUtils.java:208) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.util.MarkerUtils.lambda$readTimelineServerBasedMarkersFromFileSystem$141c8e72$1(MarkerUtils.java:185) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.fs.FSUtils.lambda$parallelizeFilesProcess$1f9929d5$1(FSUtils.java:700) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:68) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 48 more
[2021-11-29 17:43:19,818] WARN [hudi-sink|task-0] Uncaught exception (io.javalin.core.ExceptionMapper:32)
org.apache.hudi.exception.HoodieException: Error occurs when executing mapToPair
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:70)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
at org.apache.hudi.client.common.HoodieJavaEngineContext.mapToPair(HoodieJavaEngineContext.java:130)
at org.apache.hudi.common.fs.FSUtils.parallelizeFilesProcess(FSUtils.java:699)
at org.apache.hudi.common.fs.FSUtils.parallelizeSubPathProcess(FSUtils.java:682)
at org.apache.hudi.common.util.MarkerUtils.readTimelineServerBasedMarkersFromFileSystem(MarkerUtils.java:180)
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.syncMarkersFromFileSystem(MarkerDirState.java:248)
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.<init>(MarkerDirState.java:100)
at org.apache.hudi.timeline.service.handlers.MarkerHandler.getMarkerDirState(MarkerHandler.java:188)
at org.apache.hudi.timeline.service.handlers.MarkerHandler.createMarker(MarkerHandler.java:156)
at org.apache.hudi.timeline.service.RequestHandler.lambda$registerMarkerAPI$27(RequestHandler.java:448)
at org.apache.hudi.timeline.service.RequestHandler$ViewHandler.handle(RequestHandler.java:497)
at io.javalin.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:22)
at io.javalin.Javalin.lambda$addHandler$0(Javalin.java:606)
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:46)
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:17)
at io.javalin.core.JavalinServlet$service$1.invoke(JavalinServlet.kt:143)
at io.javalin.core.JavalinServlet$service$2.invoke(JavalinServlet.kt:41)
at io.javalin.core.JavalinServlet.service(JavalinServlet.kt:107)
at io.javalin.core.util.JettyServerUtil$initialize$httpHandler$1.doHandle(JettyServerUtil.kt:72)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
at org.apache.hudi.org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
at org.apache.hudi.org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1668)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
at org.apache.hudi.org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61)
at org.apache.hudi.org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174)
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.apache.hudi.org.eclipse.jetty.server.Server.handle(Server.java:502)
at org.apache.hudi.org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
at org.apache.hudi.org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
at org.apache.hudi.org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
at org.apache.hudi.org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.apache.hudi.org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
at org.apache.hudi.org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.UncheckedIOException: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at java.io.BufferedReader$1.hasNext(BufferedReader.java:574)
at java.util.Iterator.forEachRemaining(Iterator.java:115)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
at org.apache.hudi.common.util.FileIOUtils.readAsUTFStringLines(FileIOUtils.java:88)
at org.apache.hudi.common.util.MarkerUtils.readMarkersFromFile(MarkerUtils.java:208)
at org.apache.hudi.common.util.MarkerUtils.lambda$readTimelineServerBasedMarkersFromFileSystem$141c8e72$1(MarkerUtils.java:185)
at org.apache.hudi.common.fs.FSUtils.lambda$parallelizeFilesProcess$1f9929d5$1(FSUtils.java:700)
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:68)
... 48 more
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:347)
at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:303)
at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:252)
at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:197)
at java.io.DataInputStream.read(DataInputStream.java:149)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at java.io.BufferedReader$1.hasNext(BufferedReader.java:571)
... 60 more
[2021-11-29 17:43:19,818] WARN [hudi-sink|task-3] Uncaught exception (io.javalin.core.ExceptionMapper:32)
org.apache.hudi.exception.HoodieException: Error occurs when executing mapToPair
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:70)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
at org.apache.hudi.client.common.HoodieJavaEngineContext.mapToPair(HoodieJavaEngineContext.java:130)
at org.apache.hudi.common.fs.FSUtils.parallelizeFilesProcess(FSUtils.java:699)
at org.apache.hudi.common.fs.FSUtils.parallelizeSubPathProcess(FSUtils.java:682)
at org.apache.hudi.common.util.MarkerUtils.readTimelineServerBasedMarkersFromFileSystem(MarkerUtils.java:180)
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.syncMarkersFromFileSystem(MarkerDirState.java:248)
at org.apache.hudi.timeline.service.handlers.marker.MarkerDirState.<init>(MarkerDirState.java:100)
at org.apache.hudi.timeline.service.handlers.MarkerHandler.getMarkerDirState(MarkerHandler.java:188)
at org.apache.hudi.timeline.service.handlers.MarkerHandler.createMarker(MarkerHandler.java:156)
at org.apache.hudi.timeline.service.RequestHandler.lambda$registerMarkerAPI$27(RequestHandler.java:448)
at org.apache.hudi.timeline.service.RequestHandler$ViewHandler.handle(RequestHandler.java:497)
at io.javalin.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:22)
at io.javalin.Javalin.lambda$addHandler$0(Javalin.java:606)
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:46)
at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:17)
at io.javalin.core.JavalinServlet$service$1.invoke(JavalinServlet.kt:143)
at io.javalin.core.JavalinServlet$service$2.invoke(JavalinServlet.kt:41)
at io.javalin.core.JavalinServlet.service(JavalinServlet.kt:107)
at io.javalin.core.util.JettyServerUtil$initialize$httpHandler$1.doHandle(JettyServerUtil.kt:72)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
at org.apache.hudi.org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
at org.apache.hudi.org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1668)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
at org.apache.hudi.org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
at org.apache.hudi.org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61)
at org.apache.hudi.org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174)
at org.apache.hudi.org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.apache.hudi.org.eclipse.jetty.server.Server.handle(Server.java:502)
at org.apache.hudi.org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
at org.apache.hudi.org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
at org.apache.hudi.org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
at org.apache.hudi.org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
at org.apache.hudi.org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
at org.apache.hudi.org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
at org.apache.hudi.org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
at org.apache.hudi.org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.UncheckedIOException: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at java.io.BufferedReader$1.hasNext(BufferedReader.java:574)
at java.util.Iterator.forEachRemaining(Iterator.java:115)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
at org.apache.hudi.common.util.FileIOUtils.readAsUTFStringLines(FileIOUtils.java:88)
at org.apache.hudi.common.util.MarkerUtils.readMarkersFromFile(MarkerUtils.java:208)
at org.apache.hudi.common.util.MarkerUtils.lambda$readTimelineServerBasedMarkersFromFileSystem$141c8e72$1(MarkerUtils.java:185)
at org.apache.hudi.common.fs.FSUtils.lambda$parallelizeFilesProcess$1f9929d5$1(FSUtils.java:700)
at org.apache.hudi.common.function.FunctionWrapper.lambda$throwingMapToPairWrapper$3(FunctionWrapper.java:68)
... 48 more
Caused by: org.apache.hadoop.fs.ChecksumException: Checksum error: file:/tmp/hoodie/hudi-test-topic/.hoodie/.temp/20211129174217738/MARKERS6 at 0 exp: -509813218 got: -1454124197
at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:347)
at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:303)
at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:252)
at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:197)
at java.io.DataInputStream.read(DataInputStream.java:149)
at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:284)
at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:326)
at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:178)
at java.io.InputStreamReader.read(InputStreamReader.java:184)
at java.io.BufferedReader.fill(BufferedReader.java:161)
at java.io.BufferedReader.readLine(BufferedReader.java:324)
at java.io.BufferedReader.readLine(BufferedReader.java:389)
at java.io.BufferedReader$1.hasNext(BufferedReader.java:571)
... 60 more
17:43:19.826 [task-thread-hudi-sink-3] ERROR org.apache.hudi.io.HoodieAppendHandle - Error in update task at commit 20211129174217738
org.apache.hudi.exception.HoodieRemoteException: Failed to create marker file partition_3/CB086FE9AB27F032E4E97C0644CCAE6B_0-0-0_20211129174019506.parquet.marker.APPEND
status code: 500, reason phrase: Server Error
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:149) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) [?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.AbstractConnectWriter.close(AbstractConnectWriter.java:95) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:177) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.processRecords(ConnectTransactionParticipant.java:124) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:114) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:241) [connect-runtime-3.0.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_265]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_265]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: org.apache.http.client.HttpResponseException: status code: 500, reason phrase: Server Error
at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:70) ~[httpclient-4.5.13.jar:4.5.13]
at org.apache.http.client.fluent.Response.handleResponse(Response.java:90) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.http.client.fluent.Response.returnContent(Response.java:97) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.executeRequestToTimelineServer(TimelineServerBasedWriteMarkers.java:180) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:145) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 26 more
17:43:19.824 [task-thread-hudi-sink-0] ERROR org.apache.hudi.io.HoodieAppendHandle - Error in update task at commit 20211129174217738
org.apache.hudi.exception.HoodieRemoteException: Failed to create marker file partition_1/5ACAD57E1A83AF31F389CDA0BC3D595C_0-0-0_20211129174019506.parquet.marker.APPEND
status code: 500, reason phrase: Server Error
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:149) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) [?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.AbstractConnectWriter.close(AbstractConnectWriter.java:95) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:177) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.processRecords(ConnectTransactionParticipant.java:124) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:114) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:241) [connect-runtime-3.0.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_265]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_265]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: org.apache.http.client.HttpResponseException: status code: 500, reason phrase: Server Error
at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:70) ~[httpclient-4.5.13.jar:4.5.13]
at org.apache.http.client.fluent.Response.handleResponse(Response.java:90) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.http.client.fluent.Response.returnContent(Response.java:97) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.executeRequestToTimelineServer(TimelineServerBasedWriteMarkers.java:180) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:145) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 26 more
17:43:19.827 [task-thread-hudi-sink-3] ERROR org.apache.hudi.connect.transaction.ConnectTransactionParticipant - Error writing records and ending commit 20211129174217738 for partition 3
org.apache.hudi.exception.HoodieIOException: Write records failed
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:125) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.AbstractConnectWriter.close(AbstractConnectWriter.java:95) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:177) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.processRecords(ConnectTransactionParticipant.java:124) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:114) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:241) [connect-runtime-3.0.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_265]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_265]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: java.io.IOException: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: CB086FE9AB27F032E4E97C0644CCAE6B on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_3
... 16 more
Caused by: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: CB086FE9AB27F032E4E97C0644CCAE6B on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_3
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 15 more
Caused by: org.apache.hudi.exception.HoodieRemoteException: Failed to create marker file partition_3/CB086FE9AB27F032E4E97C0644CCAE6B_0-0-0_20211129174019506.parquet.marker.APPEND
status code: 500, reason phrase: Server Error
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:149) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 15 more
Caused by: org.apache.http.client.HttpResponseException: status code: 500, reason phrase: Server Error
at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:70) ~[httpclient-4.5.13.jar:4.5.13]
at org.apache.http.client.fluent.Response.handleResponse(Response.java:90) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.http.client.fluent.Response.returnContent(Response.java:97) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.executeRequestToTimelineServer(TimelineServerBasedWriteMarkers.java:180) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:145) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 15 more
17:43:19.827 [task-thread-hudi-sink-0] ERROR org.apache.hudi.connect.transaction.ConnectTransactionParticipant - Error writing records and ending commit 20211129174217738 for partition 0
org.apache.hudi.exception.HoodieIOException: Write records failed
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:125) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.AbstractConnectWriter.close(AbstractConnectWriter.java:95) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:177) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.processRecords(ConnectTransactionParticipant.java:124) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:114) [hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186) [connect-runtime-3.0.0.jar:?]
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:241) [connect-runtime-3.0.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_265]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_265]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_265]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_265]
Caused by: java.io.IOException: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: 5ACAD57E1A83AF31F389CDA0BC3D595C on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_1
... 16 more
Caused by: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: 5ACAD57E1A83AF31F389CDA0BC3D595C on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_1
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 15 more
Caused by: org.apache.hudi.exception.HoodieRemoteException: Failed to create marker file partition_1/5ACAD57E1A83AF31F389CDA0BC3D595C_0-0-0_20211129174019506.parquet.marker.APPEND
status code: 500, reason phrase: Server Error
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:149) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 15 more
Caused by: org.apache.http.client.HttpResponseException: status code: 500, reason phrase: Server Error
at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:70) ~[httpclient-4.5.13.jar:4.5.13]
at org.apache.http.client.fluent.Response.handleResponse(Response.java:90) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.http.client.fluent.Response.returnContent(Response.java:97) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.executeRequestToTimelineServer(TimelineServerBasedWriteMarkers.java:180) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:145) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_265]
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112) ~[hudi-kafka-connect-bundle-0.10.0-rc2.jar:0.10.0-rc2]
... 15 more
[2021-11-29 17:43:19,831] ERROR [hudi-sink|task-0] WorkerSinkTask{id=hudi-sink-0} RetriableException from SinkTask: (org.apache.kafka.connect.runtime.WorkerSinkTask:600)
org.apache.kafka.connect.errors.RetriableException: Intermittent write errors for Hudi for the topic/partition: hudi-test-topic:0 , ensuring kafka connect will retry
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:117)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:241)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hudi.exception.HoodieIOException: Error writing records and ending commit 20211129174217738 for partition 0
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:198)
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.processRecords(ConnectTransactionParticipant.java:124)
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:114)
... 11 more
Caused by: java.io.IOException: org.apache.hudi.exception.HoodieIOException: Write records failed
... 14 more
Caused by: org.apache.hudi.exception.HoodieIOException: Write records failed
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:125)
at org.apache.hudi.connect.writers.AbstractConnectWriter.close(AbstractConnectWriter.java:95)
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:177)
... 13 more
Caused by: java.io.IOException: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: 5ACAD57E1A83AF31F389CDA0BC3D595C on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_1
... 16 more
Caused by: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: 5ACAD57E1A83AF31F389CDA0BC3D595C on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_1
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:181)
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349)
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83)
at java.util.HashMap.forEach(HashMap.java:1289)
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80)
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56)
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41)
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120)
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112)
... 15 more
Caused by: org.apache.hudi.exception.HoodieRemoteException: Failed to create marker file partition_1/5ACAD57E1A83AF31F389CDA0BC3D595C_0-0-0_20211129174019506.parquet.marker.APPEND
status code: 500, reason phrase: Server Error
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:149)
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65)
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181)
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174)
... 23 more
Caused by: org.apache.http.client.HttpResponseException: status code: 500, reason phrase: Server Error
at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:70)
at org.apache.http.client.fluent.Response.handleResponse(Response.java:90)
at org.apache.http.client.fluent.Response.returnContent(Response.java:97)
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.executeRequestToTimelineServer(TimelineServerBasedWriteMarkers.java:180)
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:145)
... 26 more
[2021-11-29 17:43:19,831] ERROR [hudi-sink|task-3] WorkerSinkTask{id=hudi-sink-3} RetriableException from SinkTask: (org.apache.kafka.connect.runtime.WorkerSinkTask:600)
org.apache.kafka.connect.errors.RetriableException: Intermittent write errors for Hudi for the topic/partition: hudi-test-topic:3 , ensuring kafka connect will retry
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:117)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:329)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:186)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:241)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hudi.exception.HoodieIOException: Error writing records and ending commit 20211129174217738 for partition 3
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:198)
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.processRecords(ConnectTransactionParticipant.java:124)
at org.apache.hudi.connect.HoodieSinkTask.put(HoodieSinkTask.java:114)
... 11 more
Caused by: java.io.IOException: org.apache.hudi.exception.HoodieIOException: Write records failed
... 14 more
Caused by: org.apache.hudi.exception.HoodieIOException: Write records failed
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:125)
at org.apache.hudi.connect.writers.AbstractConnectWriter.close(AbstractConnectWriter.java:95)
at org.apache.hudi.connect.transaction.ConnectTransactionParticipant.handleEndCommit(ConnectTransactionParticipant.java:177)
... 13 more
Caused by: java.io.IOException: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: CB086FE9AB27F032E4E97C0644CCAE6B on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_3
... 16 more
Caused by: org.apache.hudi.exception.HoodieUpsertException: Failed to initialize HoodieAppendHandle for FileId: CB086FE9AB27F032E4E97C0644CCAE6B on commit 20211129174217738 on HDFS path file:/tmp/hoodie/hudi-test-topicpartition_3
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:181)
at org.apache.hudi.io.HoodieAppendHandle.doAppend(HoodieAppendHandle.java:349)
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.lambda$execute$0(JavaUpsertPreppedDeltaCommitActionExecutor.java:83)
at java.util.HashMap.forEach(HashMap.java:1289)
at org.apache.hudi.table.action.deltacommit.JavaUpsertPreppedDeltaCommitActionExecutor.execute(JavaUpsertPreppedDeltaCommitActionExecutor.java:80)
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:56)
at org.apache.hudi.table.HoodieJavaMergeOnReadTable.upsertPrepped(HoodieJavaMergeOnReadTable.java:41)
at org.apache.hudi.client.HoodieJavaWriteClient.upsertPreppedRecords(HoodieJavaWriteClient.java:120)
at org.apache.hudi.connect.writers.BufferedConnectWriter.flushRecords(BufferedConnectWriter.java:112)
... 15 more
Caused by: org.apache.hudi.exception.HoodieRemoteException: Failed to create marker file partition_3/CB086FE9AB27F032E4E97C0644CCAE6B_0-0-0_20211129174019506.parquet.marker.APPEND
status code: 500, reason phrase: Server Error
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:149)
at org.apache.hudi.table.marker.WriteMarkers.create(WriteMarkers.java:65)
at org.apache.hudi.io.HoodieWriteHandle.createMarkerFile(HoodieWriteHandle.java:181)
at org.apache.hudi.io.HoodieAppendHandle.init(HoodieAppendHandle.java:174)
... 23 more
Caused by: org.apache.http.client.HttpResponseException: status code: 500, reason phrase: Server Error
at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:70)
at org.apache.http.client.fluent.Response.handleResponse(Response.java:90)
at org.apache.http.client.fluent.Response.returnContent(Response.java:97)
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.executeRequestToTimelineServer(TimelineServerBasedWriteMarkers.java:180)
at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.create(TimelineServerBasedWriteMarkers.java:145)
... 26 more