Confluent, Inc. confluentgist

confluentgist / users.py
Created Jul 6, 2020
A script to produce randomized user records into a Kafka topic
import sys
import time
import os
import json
import random
from dateutil import parser
from confluent_kafka import Producer
import names  # https://pypi.org/project/names/
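The preview above shows only the script's imports. As a hedged sketch of the record-generation half (using stdlib `random` in place of the `names` package, and a made-up schema, since the full script is truncated here), the producer loop might build JSON payloads like this:

```python
import json
import random
import time

# Hypothetical user schema; the real gist's fields are not shown in the preview.
FIRST_NAMES = ["Alice", "Bob", "Carol", "Dave"]  # stand-in for the names package
REGIONS = ["us-east-1", "us-west-2", "eu-central-1"]

def random_user_record() -> str:
    """Build one randomized user record as a JSON string, ready to be
    passed to confluent_kafka.Producer.produce(topic, value=...)."""
    record = {
        "name": random.choice(FIRST_NAMES),
        "region": random.choice(REGIONS),
        "age": random.randint(18, 90),
        "ts": int(time.time() * 1000),
    }
    return json.dumps(record)

print(random_user_record())
```

Each call returns an independent randomized record; a real producer would call this in a loop and hand the string to `Producer.produce()`.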
confluentgist / build_instr_bin.sh
Created Apr 16, 2020
Bincover - Build instrumented binary
#!/usr/bin/env bash
go test . -tags testbincover -coverpkg=./... -c -o instr_bin -ldflags="-X github.com/confluentinc/bincover/examples/echo-arg.isTest=true"
confluentgist / main.go
Last active Apr 22, 2020
Bincover example application
package main
import (
"fmt"
"os"
"strconv"
"github.com/confluentinc/bincover"
)
confluentgist / main_test.go
Last active Apr 22, 2020
Bincover example main method test
package main
import (
"fmt"
"log"
"os"
"os/exec"
"regexp"
"testing"
confluentgist / docker-compose.yml
---
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.3.1
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
confluentgist / Streaming_Machine_Learning_with_Kafka_and_TensorFlow.py
import numpy as np
import tensorflow as tf
import tensorflow_io.kafka as kafka_io

# 1. Consume streaming data with Kafka and TensorFlow I/O
def func_x(x):
    # Decode image to (28, 28)
    x = tf.io.decode_raw(x, out_type=tf.uint8)
    x = tf.reshape(x, [28, 28])
    # Convert to float32 for tf.keras
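The decode/reshape step above can be mimicked without TensorFlow using plain NumPy — a sketch assuming each Kafka message value is one raw 784-byte (28×28) uint8 image, as in MNIST:

```python
import numpy as np

def decode_image(raw: bytes) -> np.ndarray:
    """Decode a raw 28x28 uint8 image (e.g. one MNIST record pulled
    off a Kafka topic) into a (28, 28) float32 array in [0, 1]."""
    x = np.frombuffer(raw, dtype=np.uint8)  # mirrors tf.io.decode_raw
    x = x.reshape(28, 28)                   # mirrors tf.reshape
    return x.astype(np.float32) / 255.0     # convert to float32 for tf.keras

raw = bytes(range(256)) * 3 + bytes(16)  # 784 dummy bytes
img = decode_image(raw)
print(img.shape, img.dtype)  # (28, 28) float32
```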
confluentgist / increased-partitions.sql
Created Jan 11, 2020 — forked from miguno/increased-partitions.sql
ksqlDB example: Create a new stream with the desired number of partitions.
CREATE STREAM products ...;
CREATE STREAM products_repartitioned
WITH (PARTITIONS=30) AS
SELECT * FROM products
EMIT CHANGES;
confluentgist / aggregation.java
Created Jan 9, 2020 — forked from miguno/aggregation.java
Kafka Streams Example: Continuously aggregating a stream into a table
// Continuously aggregating a KStream into a KTable.
// (Value type assumed here to be a POJO with a public `username` field.)
KStream<String, LocationUpdate> locationUpdatesStream = ...;
KTable<String, Long> locationsPerUser =
    locationUpdatesStream
        .groupBy((k, v) -> v.username)
        .count();
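The table semantics of `groupBy(...).count()` can be illustrated outside Kafka Streams with a plain Python fold over a finite stream of usernames — a toy model of the continuously updated KTable, not the Kafka Streams API:

```python
from collections import Counter

def aggregate_locations(usernames):
    """Fold a stream of per-update usernames into a per-user count,
    mirroring the KTable produced by groupBy((k, v) -> v.username).count()."""
    counts = Counter()
    for username in usernames:
        counts[username] += 1  # each location update increments that user's count
    return dict(counts)

# Toy stream of username values extracted from location updates.
stream = ["alice", "bob", "alice", "alice", "bob"]
print(aggregate_locations(stream))  # {'alice': 3, 'bob': 2}
```

In the real system the "table" is never final: each arriving record updates one row, which is exactly what the incremental `counts[username] += 1` step models.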
confluentgist / aggregation.sql
Created Jan 9, 2020 — forked from miguno/aggregation.sql
ksqlDB example: Continuously aggregating a stream into a table with a push query
-- Continuously aggregating a stream into a table with a ksqlDB push query.
CREATE STREAM locationUpdatesStream ...;
CREATE TABLE locationsPerUser AS
SELECT username, COUNT(*)
FROM locationUpdatesStream
GROUP BY username
EMIT CHANGES;
confluentgist / topic-as-table.java
Created Jan 9, 2020 — forked from miguno/topic-as-table.java
Kafka Streams Example: read topic as table
// Create KTable from Kafka topic.
KTable<String, String> table = builder.table("input-topic", Consumed.with(Serdes.String(), Serdes.String()));