@dacr
Last active April 2, 2023 10:11
kafka public cheat sheet / published by https://github.com/dacr/code-examples-manager #983ba075-2b9f-4c08-93b0-01bcc7482f69/f931498c36bb5ce238ca34c0f30bec599bc6778f

kafka public cheat sheet

resources

notes

  • AVRO is the most widely used serialization protocol
    • with a Schema Registry => allows more compact data serialization, with versioning as a bonus
  • easy way to run the avro tools:
    • cs launch org.apache.avro:avro-tools:1.10.2 -M org.apache.avro.tool.Main -- idl file.avdl
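A minimal `file.avdl` to try the command above; the protocol and record names are purely illustrative:

```avdl
@namespace("com.example")
protocol Example {
  record Message {
    string message;
    long timestamp;
  }
}
```

The `idl` tool compiles this IDL into the equivalent JSON Avro protocol (`.avpr`).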

Using tgz packaging

install

mkdir $HOME/kafka
cd $HOME/kafka
curl -o kafka.tar.gz https://mirrors.ircam.fr/pub/apache/kafka/2.8.0/kafka_2.13-2.8.0.tgz
tar xvfz kafka.tar.gz
ln -s kafka_* current

env & aliases

export KAFKA_HOME=$HOME/kafka/current
export KAFKA_BIN=$KAFKA_HOME/bin
export PATH=$PATH:$KAFKA_BIN
export MYIP=$(hostname -I | awk '{print $1}')
export KAFKA_IP=127.0.0.1
export KAFKA_ENDPOINT=$KAFKA_IP:9092

alias topics="$KAFKA_BIN/kafka-topics.sh --bootstrap-server $KAFKA_ENDPOINT"
alias console-producer="$KAFKA_BIN/kafka-console-producer.sh --broker-list $KAFKA_ENDPOINT"
alias console-consumer="$KAFKA_BIN/kafka-console-consumer.sh --bootstrap-server $KAFKA_ENDPOINT --timeout-ms 5000"
alias consumer-groups="$KAFKA_BIN/kafka-consumer-groups.sh --bootstrap-server $KAFKA_ENDPOINT"
alias configs="$KAFKA_BIN/kafka-configs.sh --bootstrap-server $KAFKA_ENDPOINT"

start

With zookeeper

nohup $KAFKA_BIN/zookeeper-server-start.sh $KAFKA_HOME/config/zookeeper.properties > zookeeper.nohup &
echo $! > zookeeper.pid

nohup $KAFKA_BIN/kafka-server-start.sh $KAFKA_HOME/config/server.properties > kafka.nohup &
echo $! > kafka.pid
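Since the servers are started with nohup, a matching stop sketch using the pid files recorded above (broker first, then zookeeper); the guard makes it a no-op when nothing is running:

```shell
# stop kafka before zookeeper, using the pid files written at startup
for svc in kafka zookeeper; do
  if [ -f "$svc.pid" ]; then
    kill "$(cat "$svc.pid")" && rm "$svc.pid"
  fi
done
```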

Without zookeeper (not yet ready for production)

curl -o server.properties https://raw.githubusercontent.com/apache/kafka/6d1d68617ecd023b787f54aafc24a4232663428d/config/kraft/server.properties
MYIP=$(hostname -I | awk '{print $1}')
sed -i -E "s@advertised.listeners=PLAINTEXT://localhost:9092@advertised.listeners=PLAINTEXT://$MYIP:9092@" server.properties

CLUSTER_ID=$(kafka-storage.sh random-uuid)
kafka-storage.sh format --config server.properties --cluster-id $CLUSTER_ID
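For reference, the KRaft `server.properties` fetched above contains lines along these lines (exact values may differ between Kafka versions); in combined mode the same node acts as broker and controller:

```properties
# the node serves both roles in this single-node setup
process.roles=broker,controller
node.id=1
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://:9092,CONTROLLER://:9093
advertised.listeners=PLAINTEXT://localhost:9092
```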

nohup $KAFKA_BIN/kafka-server-start.sh server.properties > kafka.nohup &
echo $! > kafka.pid

basic usage


topics --create --replication-factor 1 --partitions 1 --topic test
topics --list

echo '{"message":"Hello world"}' | console-producer --topic test

console-consumer --from-beginning --topic test --group myconsumer
# a second run returns nothing: the group's offsets were committed by the first run
console-consumer --from-beginning --topic test --group myconsumer

consumer-groups --list
consumer-groups --describe --group myconsumer

consumer-groups --execute --reset-offsets --to-earliest --group myconsumer --all-topics

# after the offset reset, the messages are delivered again from the beginning
console-consumer --from-beginning --topic test --group myconsumer

configs --describe --topic test --all

topics --delete --topic test

User interfaces

kafdrop

MYIP=$(hostname -I | awk '{print $1}')

docker run -it -p 9000:9000 \
  -e KAFKA_BROKERCONNECT=$MYIP:9092 \
  obsidiandynamics/kafdrop
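The same kafdrop container can be run via docker-compose; a hypothetical `docker-compose.yml` pointing at an already-running broker (replace the address with your own):

```yaml
version: "3"
services:
  kafdrop:
    image: obsidiandynamics/kafdrop
    ports:
      - "9000:9000"
    environment:
      # address of your existing broker (placeholder value)
      KAFKA_BROKERCONNECT: "192.168.1.10:9092"
```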

Visit -> http://127.0.0.1:9000/
