
@dalelane
Created March 13, 2024 20:22
Using the Event Streams Producer API to produce data from a folder, one message per file
#!/bin/sh
# the URL from Step 1
URL=https://my-kafka-cluster-ibm-es-recapi-external-event-automation.apps.15eb31a112f5c312116f22c1.cloud.techzone.ibm.com
# the credentials from Step 2
USERNAME=demo-username
PASSWORD=tfmA896q8l3rCcbqa15k4NOcbn8Zm5ir
# the folder with the test files you created in Step 3
TEST_DATA_DIR=data
# the topic to produce messages to
TOPIC=MYTOPIC

# send one HTTP POST per file: each file's contents become one Kafka message
for file in "$TEST_DATA_DIR"/*
do
    echo "$file"
    curl \
        --silent \
        -X POST \
        -H "Content-Type: text/plain" \
        -k \
        --data-binary @"$file" \
        -u "$USERNAME:$PASSWORD" \
        "$URL/topics/$TOPIC/records" > /dev/null
done
# notes:
#  -k skips TLS certificate verification (the demo cluster uses a
#     self-signed certificate); drop it for a cluster with a trusted cert
#  --data-binary (rather than --data) preserves newlines in each file,
#     so the message payload matches the file contents exactly
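Step 3 above assumes a folder of test files already exists. A minimal sketch for generating one (the file names and JSON contents here are illustrative, not part of the original gist):

```shell
#!/bin/sh
# create a folder of sample files: one file = one future Kafka message
TEST_DATA_DIR=data
mkdir -p "$TEST_DATA_DIR"

for i in 1 2 3
do
    # each file holds a small JSON payload identifying the test event
    printf '{"id": %d, "message": "test event %d"}\n' "$i" "$i" \
        > "$TEST_DATA_DIR/event-$i.json"
done

ls "$TEST_DATA_DIR"
```

Run this before the producer script so the `for file in "$TEST_DATA_DIR"/*` loop has something to send.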