Public gists from Confluent, Inc. (confluentgist)
confluentgist / users.py
Created July 6, 2020 17:22
A script to produce randomized user records into a Kafka topic
import sys
import time
import os
import json
import random
from dateutil import parser
from confluent_kafka import Producer
import names  # https://pypi.org/project/names/
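
The preview above ends at the imports. As a rough sketch of the produce loop such a script might contain (the broker address, topic name, and record fields here are assumptions for illustration, not the original gist's code):

producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed local broker

for _ in range(100):
    # Build a randomized user record; the field names are illustrative.
    record = {
        "name": names.get_full_name(),
        "user_id": random.randint(1, 10000),
        "timestamp_ms": int(time.time() * 1000),
    }
    # Key by user_id so all records for a given user land in the same partition.
    producer.produce("users", key=str(record["user_id"]), value=json.dumps(record))
    producer.poll(0)  # serve delivery report callbacks

producer.flush()  # block until all queued messages are delivered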
confluentgist / build_instr_bin.sh
Created April 16, 2020 20:47
Bincover - Build instrumented binary
#!/usr/bin/env bash
# Compile (without running) a coverage-instrumented test binary:
#   -c              writes the test binary instead of executing the tests
#   -coverpkg=./... instruments every package in the module
#   -ldflags -X     sets the package-level isTest flag at link time
go test . -tags testbincover -coverpkg=./... -c -o instr_bin -ldflags="-X github.com/confluentinc/bincover/examples/echo-arg.isTest=true"
confluentgist / main.go
Last active April 22, 2020 17:55
Bincover example application
package main

import (
	"fmt"
	"os"
	"strconv"

	"github.com/confluentinc/bincover"
)
confluentgist / main_test.go
Last active April 22, 2020 17:55
Bincover example main method test
package main

import (
	"fmt"
	"log"
	"os"
	"os/exec"
	"regexp"
	"testing"
)
---
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:5.3.1
    hostname: zookeeper
    container_name: zookeeper
    ports:
      - "2181:2181"
import numpy as np
import tensorflow as tf
import tensorflow_io.kafka as kafka_io

# 1. Consume streaming data with Kafka and TensorFlow I/O
def func_x(x):
    # Decode image to (28, 28)
    x = tf.io.decode_raw(x, out_type=tf.uint8)
    x = tf.reshape(x, [28, 28])
    # Convert to float32 for tf.keras (scaled to [0, 1])
    return tf.cast(x, tf.float32) / 255.0
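
A sketch of how a decode function like this could be attached to a Kafka topic with the KafkaDataset API from tensorflow-io (the topic name, group id, and server address are assumptions for illustration):

# Sketch, not the original gist: stream records from a Kafka topic and decode
# each message into an image tensor. Topic/group/server values are assumed.
train_images = kafka_io.KafkaDataset(
    ["mnist_images"],
    servers="localhost:9092",
    group="tf-consumer",
    eof=True,  # stop once the end of the topic is reached
).map(func_x)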
confluentgist / increased-partitions.sql
Created January 11, 2020 00:14 — forked from miguno/increased-partitions.sql
ksqlDB example: Create a new stream with the desired number of partitions.
CREATE STREAM products ...;
CREATE STREAM products_repartitioned
WITH (PARTITIONS=30) AS
SELECT * FROM products
EMIT CHANGES;
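
Statements like this can also be submitted to the ksqlDB server's REST API; a minimal Python sketch, assuming the server runs at localhost:8088 (the conventional default port):

import requests

# POST a persistent statement to ksqlDB's /ksql endpoint (the server address
# is an assumption for illustration).
resp = requests.post(
    "http://localhost:8088/ksql",
    headers={"Accept": "application/vnd.ksql.v1+json"},
    json={
        "ksql": "CREATE STREAM products_repartitioned WITH (PARTITIONS=30) AS "
                "SELECT * FROM products EMIT CHANGES;",
        "streamsProperties": {},
    },
)
resp.raise_for_status()
print(resp.json())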
confluentgist / aggregation.java
Created January 9, 2020 22:16 — forked from miguno/aggregation.java
Kafka Streams Example: Continuously aggregating a stream into a table
// Continuously aggregating a KStream into a KTable.
// (LocationUpdate is an illustrative value type carrying a username field.)
KStream<String, LocationUpdate> locationUpdatesStream = ...;
KTable<String, Long> locationsPerUser = locationUpdatesStream
    .groupBy((k, v) -> v.username)
    .count();
confluentgist / aggregation.sql
Created January 9, 2020 22:15 — forked from miguno/aggregation.sql
ksqlDB example: Continuously aggregating a stream into a table with a push query
-- Continuously aggregating a stream into a table with a ksqlDB push query.
CREATE STREAM locationUpdatesStream ...;
CREATE TABLE locationsPerUser AS
SELECT username, COUNT(*)
FROM locationUpdatesStream
GROUP BY username
EMIT CHANGES;
confluentgist / topic-as-table.java
Created January 9, 2020 07:56 — forked from miguno/topic-as-table.java
Kafka Streams Example: read topic as table
// Create a KTable from a Kafka topic ("builder" is a StreamsBuilder instance).
KTable<String, String> table = builder.table("input-topic", Consumed.with(Serdes.String(), Serdes.String()));