
Robin Moffatt rmoff

@rmoff
rmoff / kafkacat.adoc
Last active Jan 17, 2020
Show last three messages from a Kafka topic with kafkacat
# Consume (-C) from the topic, starting three messages before the end of the
# partition (-o-3), exit after three messages (-c3), and print each message's
# metadata and payload using a custom format string (-f).
kafkacat -b localhost:9092 \
         -t _kafka-connect-group-01-status \
         -C \
         -o-3 \
         -c3 \
         -f 'Topic %t / Partition %p / Offset: %o / Timestamp: %T\nHeaders: %h\nKey (%K bytes): %k\nPayload (%S bytes): %s\n--\n'
@rmoff
rmoff / docker-compose.yml
Last active Dec 2, 2019
ksqlDB with external Kafka Connect worker and installed connector plugin
---
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.3.1
    container_name: zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
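The snippet above is cut off after the ZooKeeper service. Per the gist's description, the interesting part is the ksqlDB server pointing at an external Kafka Connect worker instead of running an embedded one; a minimal sketch of what that service might look like (the service names, image tag, and ports are assumptions, not content from the original gist):

```yaml
  ksqldb-server:
    image: confluentinc/ksqldb-server:0.6.0
    container_name: ksqldb-server
    depends_on:
      - kafka
      - kafka-connect
    environment:
      KSQL_BOOTSTRAP_SERVERS: kafka:29092
      KSQL_LISTENERS: http://0.0.0.0:8088
      # ksql.connect.url — point ksqlDB at the external Kafka Connect
      # worker rather than spawning an embedded one
      KSQL_KSQL_CONNECT_URL: http://kafka-connect:8083
```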
@rmoff
rmoff / docker-compose.yml
Last active Dec 2, 2019
ksqlDB with external Kafka Connect worker
---
version: '3'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.3.1
    container_name: zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
@rmoff
rmoff / ccloud_connect.adoc
Created Sep 26, 2019
Configuring Kafka Connect with Confluent Cloud

Configuring Kafka Connect with Confluent Cloud

So you want to connect Kafka Connect to Confluent Cloud? Here’s how.

Overview

  1. Configure the worker

  2. Pre-create any topics that your source connector will write to (since auto-topic creation is not enabled on Confluent Cloud)
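Step 1 amounts to giving the worker the Confluent Cloud broker endpoint and SASL credentials; a hedged sketch of the relevant worker properties (the endpoint and key/secret placeholders are illustrative, not values from the original gist):

```properties
# Confluent Cloud broker endpoint and SASL/PLAIN credentials for the worker
bootstrap.servers=CCLOUD_BROKER_ENDPOINT:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="CCLOUD_API_KEY" password="CCLOUD_API_SECRET";

# The embedded producer/consumer used by connectors need the same overrides
producer.security.protocol=SASL_SSL
producer.sasl.mechanism=PLAIN
consumer.security.protocol=SASL_SSL
consumer.sasl.mechanism=PLAIN
```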

Single Message Transform to add topic, partition, offset to sink message
curl -i -X PUT -H "Content-Type:application/json" \
    http://localhost:8083/connectors/sink-elastic-orders-00/config \
    -d '{
          "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
          "topics": "orders",
          "connection.url": "http://elasticsearch:9200",
          "type.name": "kafkaconnect",
          "key.ignore": "true",
          "schema.ignore": "false",
          "errors.tolerance": "all",
@rmoff
rmoff / docker-compose.yml
Last active May 22, 2019
Example Docker Compose showing Apache Kafka listener configuration
[…]
  kafka0:
    image: "confluentinc/cp-enterprise-kafka:5.2.1"
    ports:
      - '9092:9092'
      - '29094:29094'
    depends_on:
      - zookeeper
    environment:
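The environment block is truncated. The listener settings such a gist typically demonstrates look like the following (listener names and hostnames are assumptions, labelled as a hypothetical continuation):

```yaml
# Hypothetical continuation of the truncated environment: block above
KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: LISTENER_INTERNAL:PLAINTEXT,LISTENER_EXTERNAL:PLAINTEXT
# What the broker binds to
KAFKA_LISTENERS: LISTENER_INTERNAL://0.0.0.0:29092,LISTENER_EXTERNAL://0.0.0.0:9092
# What clients are told to connect back to — internal clients use the
# Docker hostname, external clients use localhost via the published port
KAFKA_ADVERTISED_LISTENERS: LISTENER_INTERNAL://kafka0:29092,LISTENER_EXTERNAL://localhost:9092
KAFKA_INTER_BROKER_LISTENER_NAME: LISTENER_INTERNAL
```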
@rmoff
rmoff / connector-delete.sh
Created May 1, 2019
Selectively delete Kafka Connect connectors using peco
curl -s "http://localhost:8083/connectors"| \
jq '.[]'| \
peco | \
xargs -I{connector_name} curl -s -XDELETE "http://localhost:8083/connectors/"{connector_name}
@rmoff
rmoff / connect_status.sh
Created May 1, 2019
Show status of all Kafka Connect connectors
curl -s "http://localhost:8083/connectors"| \
jq '.[]'| \
xargs -I{connector_name} curl -s "http://localhost:8083/connectors/"{connector_name}"/status" | \
jq -c -M '[.name,.connector.state,.tasks[].state]|join(":|:")' | \
column -s : -t | \
sed 's/\"//g' | \
sort
@rmoff
rmoff / gist:c3db21aae4b9232b07699fbe07e10d68
Created Mar 13, 2019
Neo4j/twitter Kafka Connect demo
= Testing the Neo4j Connector with Twitter as a source of data
Caveat: I have never used Neo4j before. I'm familiar with the general concept of a property graph though.
Environment: https://github.com/rmoff/neo4j-streams/blob/master/kafka-connect-neo4j/docker/docker-compose.yml
== Start here
Create a Twitter source
example.pcap
{
  "timestamp": "1547735265961",
  "layers": {
    "frame": {
      "frame_frame_interface_id": "0",
      "frame_interface_id_frame_interface_name": "en0",
      "frame_frame_encap_type": "1",
      "frame_frame_time": "Jan 17, 2019 14:27:45.961581000 GMT",
      "frame_frame_offset_shift": "0.000000000",
      "frame_frame_time_epoch": "1547735265.961581000",