riff v0.5-snapshot streaming demo

install riff using kapp

riff-kapp-install.sh

install a snapshot build of riff cli

make clean build install

install kafka

kubectl apply -f ~/riff/kafka-broker.yaml
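
optional check: confirm the broker pods come up (the namespace depends on what kafka-broker.yaml creates)

kubectl get pods --all-namespaces | grep -i kafka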

configure the streaming runtime with a KafkaProvider

riff streaming kafka-provider create franz --bootstrap-servers kafkabroker:9092
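
optional check: confirm the provider was created (assumes this CLI build has a list subcommand for kafka providers)

riff streaming kafka-provider list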

configure local registry at registry.pfs.svc.cluster.local:5000

auto-restart means you only have to do this once

docker run -d -p 5000:5000 --restart=always -v ~/.registry/storage:/var/lib/registry registry:2
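
optional check: hit the registry's v2 API to confirm it is up (the catalog is empty until the first image is pushed)

curl http://localhost:5000/v2/_catalog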

This part is necessary every time you refresh the cluster

kubectl create namespace pfs
kubectl create service externalname registry -n pfs --external-name=host.docker.internal --tcp=5000:5000

export REGISTRY_USER=testuser
export REGISTRY=registry.pfs.svc.cluster.local:5000

riff credential apply my-pfs \
--registry $REGISTRY \
--registry-user $REGISTRY_USER \
--default-image-prefix $REGISTRY/$REGISTRY_USER
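
sanity check: confirm the credential and default image prefix were applied (assumes riff credential list is available in this build)

riff credential list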

build square

riff function create square \
  --git-repo https://github.com/jldec/node-fun-square \
  --artifact square.js \
  --tail
# OR build from local source using --local-path
cd ~/riff/node-fun-square

riff function create square \
  --local-path ~/riff/node-fun-square \
  --artifact square.js \
  --tail
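
optional check: watch for the build to report ready (assumes riff function list is available)

riff function list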

create in and out streams

riff streaming stream create in  --provider franz-kafka-provisioner --content-type application/json 
riff streaming stream create out --provider franz-kafka-provisioner --content-type application/json 
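
optional check: confirm both streams were provisioned (assumes riff streaming stream list is available)

riff streaming stream list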

wire square to in and out streams

riff streaming processor create square \
  --function-ref square \
  --input in \
  --output out
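
optional check: confirm the processor is running (assumes riff streaming processor list is available)

riff streaming processor list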

create a container and core deployer for the http source

riff container create http-source-container --image jldec/http-source:v0.0.1

export KAFKA_GATEWAY=$(kubectl get stream in -o jsonpath={.status.address.gateway})
echo KAFKA_GATEWAY=$KAFKA_GATEWAY

riff core deployer create http-source --container-ref http-source-container \
  --env OUTPUTS=/in=$KAFKA_GATEWAY/default_in,/out=$KAFKA_GATEWAY/default_out \
  --env OUTPUT_CONTENT_TYPES=application/json,application/json
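
optional check: confirm the container and deployer are ready (assumes the corresponding list subcommands are available)

riff container list
riff core deployer list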

create the random container and deploy it on the core runtime

riff container create random --image jldec/random:v0.0.2
riff core deployer create core-random --container-ref random --tail

port-forward to random (in a separate terminal)

kubectl port-forward service/`kubectl get service -l core.projectriff.io/deployer=core-random -o jsonpath='{.items[0].metadata.name}'` 8081:80

configure random to send random numbers to /in on the http-source

export HTTP_SOURCE=$(kubectl get deployer http-source -o jsonpath={.status.serviceName})
echo HTTP_SOURCE=$HTTP_SOURCE

curl http://localhost:8081/ -w '\n' \
-H 'Content-Type: application/json' \
-d "{\"url\":\"http://$HTTP_SOURCE/in\", \"freq\":3}"

build echo function (also exports console.log)

riff function create echo \
  --local-path ~/riff/echo \
  --artifact echo.js \
  --tail

create echo-out stream and wire echo as a new processor from out to echo-out

riff streaming stream create echo-out --provider franz-kafka-provisioner --content-type application/json 

riff streaming processor create echo-out \
  --function-ref echo \
  --input out \
  --output echo-out
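
optional end-to-end check: the note above says echo also exports console.log, so the squared values should show up in the echo-out processor pod logs. same caveat as before, the processor label is an assumption

kubectl get pods --show-labels | grep echo
kubectl logs -l streaming.projectriff.io/processor=echo-out --all-containers --tail=20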