
Installing Spring Cloud Data Flow with Ingress support

Using the latest 2.5.0.RELEASE version of the Helm Chart for Spring Cloud Data Flow.

Software prerequisites

Have the following installed: kubectl, Helm v2, and a Java runtime (needed to run the Data Flow shell jar).

Install Spring Cloud Data Flow Shell

Download the shell from https://repo.spring.io/libs-release/org/springframework/cloud/spring-cloud-dataflow-shell/2.5.0.RELEASE/spring-cloud-dataflow-shell-2.5.0.RELEASE.jar
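
One way to download it from the command line (a sketch, assuming wget is available):

wget https://repo.spring.io/libs-release/org/springframework/cloud/spring-cloud-dataflow-shell/2.5.0.RELEASE/spring-cloud-dataflow-shell-2.5.0.RELEASE.jar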

Read more about using the shell in the documentation.

Kubernetes cluster

Create a cluster:

Allocate at least 4 CPUs and 8 GB of memory for the cluster.
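
For example, a Minikube cluster sized accordingly could be created like this (a sketch, assuming you are using Minikube; GKE and Docker Desktop clusters are sized through their own tooling):

minikube start --cpus=4 --memory=8192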

Initialize the cluster for Helm v2

kubectl create serviceaccount tiller -n kube-system
kubectl create clusterrolebinding tiller --clusterrole cluster-admin --serviceaccount kube-system:tiller
helm init --wait --service-account tiller
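
To confirm that Tiller was deployed and that Helm can talk to it, the following should report both a client and a server (Tiller) version:

helm version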

Install NGINX Ingress Controller

On local clusters that don't support LoadBalancer services, we can still install the NGINX Ingress Controller and access the services via an Ingress resource.

NGINX Ingress on GKE

Install NGINX Ingress using Helm 2:

kubectl create namespace nginx-ingress
helm repo update
helm install --name nginx-ingress --namespace nginx-ingress stable/nginx-ingress --wait

The NGINX ingress controller is exposed as a LoadBalancer service with an external IP address.

Run the following to verify:

kubectl get services --namespace nginx-ingress

NGINX Ingress on Docker Desktop

NOTE: We are taking advantage of Docker Desktop's support for a single LoadBalancer service, which is exposed on port 80 on localhost. To use this feature, make sure you don't already have another service running on that port.

Install NGINX Ingress using Helm 2:

kubectl create namespace nginx-ingress
helm repo update
helm install --name nginx-ingress --namespace nginx-ingress stable/nginx-ingress --wait

The NGINX ingress controller is exposed as a LoadBalancer service with an external IP address. On Docker Desktop it should be exposed on port 80 on localhost.

Run the following to verify:

kubectl get services --namespace nginx-ingress

NGINX Ingress on Minikube

Install NGINX Ingress using:

minikube addons enable ingress

The NGINX ingress controller is exposed on port 80 on the Minikube IP address.
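
One way to check that the addon's controller pod is running (assuming the addon deploys it in the kube-system namespace):

kubectl get pods -n kube-system | grep nginx-ingress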

Install Spring Cloud Data Flow using Helm

Look up ingress IP address

For GKE:

export INGRESS=$(kubectl get svc/nginx-ingress-controller -n nginx-ingress -ojsonpath='{.status.loadBalancer.ingress[0].ip}')

For Docker Desktop:

export INGRESS=127.0.0.1

For Minikube:

export INGRESS=$(minikube ip)
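
Regardless of platform, a quick sanity check that the variable is set:

echo "Using ingress IP: ${INGRESS}"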

Install the Helm chart for Data Flow

We will install Data Flow in the scdf-system namespace, enable monitoring, and set all services to use ClusterIP. We also enable the Ingress resource, using xip.io hostnames based on the IP address we looked up above.

kubectl create namespace scdf-system
helm repo update
helm install --name scdf --namespace scdf-system \
  --set features.monitoring.enabled=true \
  --set server.service.type=ClusterIP \
  --set grafana.service.type=ClusterIP \
  --set prometheus.proxy.service.type=ClusterIP \
  --set ingress.enabled=true \
  --set ingress.protocol=http \
  --set ingress.server.host=scdf.${INGRESS}.xip.io \
  --set ingress.grafana.host=grafana.${INGRESS}.xip.io \
  stable/spring-cloud-data-flow --wait
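
Once the chart install completes, it can be useful to check that the pods are running and that the Ingress resources were created (exact resource names may vary by chart version):

kubectl get pods -n scdf-system
kubectl get ingress -n scdf-system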

Connect using the shell

java -jar spring-cloud-dataflow-shell-2.5.0.RELEASE.jar --dataflow.uri=http://scdf.${INGRESS}.xip.io
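
If the shell fails to connect, a quick way to check that the server is reachable through the Ingress is to hit its about endpoint (assuming curl is installed):

curl http://scdf.${INGRESS}.xip.io/about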

Load some stream and task apps

Using the Data Flow Shell run the following commands:

dataflow:>app import https://dataflow.spring.io/rabbitmq-docker-latest
dataflow:>app import https://dataflow.spring.io/task-docker-latest
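
To confirm the applications were registered, list them from the shell:

dataflow:>app list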

Create and deploy a simple stream

dataflow:>stream create --definition "time | log" --name ticktock
dataflow:>stream deploy ticktock
dataflow:>stream list
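
Once the stream is deployed, the time and log pods run in the scdf-system namespace. One way to watch the ticktock output is to tail the log sink pod (the pod name will vary; the name below is a placeholder):

kubectl get pods -n scdf-system
kubectl logs -f <ticktock-log-pod-name> -n scdf-system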

Create and launch a simple task

dataflow:>task create --definition "timestamp" --name mytime
dataflow:>task launch mytime
dataflow:>task list
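
To check the result of the launch, the shell can also show the task executions:

dataflow:>task execution list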