Some Source and some different Channel in Knative Eventing

Note: Pseudo code...

Below is some YAML (and no implementation) for a MySQL Binlog Source and a Kafka Channel...

We need a ClusterProvisioner for our Source, e.g. the container provisioner from Nicolas (see #513):

apiVersion: eventing.knative.dev/v1alpha1
kind: ClusterProvisioner
metadata:
  name: container
spec:
  reconciles:
    group: eventing.knative.dev
    kind: Source

and another ClusterProvisioner for the Channel (e.g. the Kafka one from Sabari in #468):

apiVersion: eventing.knative.dev/v1alpha1
kind: ClusterProvisioner
metadata:
  name: kafka
spec:
  reconciles:
    group: eventing.knative.dev
    kind: Channel

Now that we have the two Provisioners in place, we need a Channel backed by Apache Kafka:

apiVersion: eventing.knative.dev/v1alpha1
kind: Channel
metadata:
  name: my-kafka-channel
spec:
  provisioner:
    ref:
      apiVersion: eventing.knative.dev/v1alpha1
      kind: ClusterProvisioner
      name: kafka
  arguments:
    args:
      bootstrapservers: "something.somewhere.com:9092"

And our Source needs to connect to MySQL, so we might have:

apiVersion: eventing.knative.dev/v1alpha1
kind: Source
metadata:
  name: mysql-binlog
  namespace: default
spec:
  provisioner:
    ref:
      name: container
  channel:
    ref:
      kind: Channel
      apiVersion: eventing.knative.dev/v1alpha1
      name: my-kafka-channel
  arguments:
    image: docker.io/mrbean/mysqlsource
    args:
      mysql_srv: "something.somewhere.com:3306"
      ...

NOTE: The Source here uses the channel ref. This assumes (I guess) that the Channel is already provisioned. I agree with @scothis that a Source itself should NOT create the Channel.

Now we have all the building blocks together. The Source can write to the Channel. In order to receive messages from the Channel, we need to define a Subscription that connects our "service" to the Channel:

apiVersion: eventing.knative.dev/v1alpha1
kind: Subscription
metadata:
  name: listentochannel
  namespace: default
spec:
  from:
    kind: Channel
    apiVersion: eventing.knative.dev/v1alpha1
    name: my-kafka-channel
  call:
    target:
      kind: Service
      apiVersion: serving.knative.dev/v1alpha1
      name: my-kafka-logger

With this example, MySQL binlog events would be written to a Kafka topic, which can then be consumed by a Knative Serving function (receiving CloudEvents).
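
The my-kafka-logger Service referenced in the Subscription is not defined above. A minimal sketch of what it could look like, assuming the runLatest shape of a serving.knative.dev/v1alpha1 Service (the image name is just a placeholder for a container that logs the CloudEvents it receives):

apiVersion: serving.knative.dev/v1alpha1
kind: Service
metadata:
  name: my-kafka-logger
  namespace: default
spec:
  runLatest:
    configuration:
      revisionTemplate:
        spec:
          container:
            # placeholder image: anything that accepts HTTP POSTs carrying CloudEvents and logs them
            image: docker.io/mrbean/kafkalogger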
