Install DC/OS Packages: Elastic + Kibana

Tested with: DC/OS Enterprise 1.12.3 - Strict Mode

  • Install the DC/OS Enterprise CLI
dcos package install dcos-enterprise-cli --cli --yes
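With the CLI in place, the Elastic and Kibana packages themselves can be installed. A minimal sketch, assuming the package names as published in the DC/OS Catalog and default package options:

```shell
# Install the Elastic package (Elasticsearch) from the DC/OS Catalog
dcos package install elastic --yes

# Kibana ships as its own Catalog package and is installed separately
dcos package install kibana --yes
```

On a strict-mode Enterprise cluster, both packages typically also need service accounts and secrets configured via an options file (`--options=…`), which is omitted here.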
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

sc = SparkContext(appName="StreamingExampleWithKafka")
# Batch interval of 10 seconds
ssc = StreamingContext(sc, 10)

# Kafka brokers exposed through the DC/OS layer-4 load balancer
opts = {"metadata.broker.list": "broker.kafka.l4lb.thisdcos.directory:9092"}
kvs = KafkaUtils.createDirectStream(ssc, ["mytopic"], opts)

# Records arrive as (key, value) tuples; keep only the message value
lines = kvs.map(lambda x: x[1])
counts = lines.flatMap(lambda line: line.split(" ")) \
    .map(lambda word: (word, 1)) \
    .reduceByKey(lambda a, b: a + b)
counts.pprint()

ssc.start()
ssc.awaitTermination()

Demo: CI/CD with TFC + Vault AppRole

References:

  • Recommended Pattern for Vault AppRole Use | Vault - HashiCorp Learn
  • Terraform Cloud Secrets Engine | Vault - HashiCorp Learn
  • https://registry.terraform.io/providers/hashicorp/vault/latest/docs/data-sources/generic_secret

This demo shows how a CI/CD tool such as GitLab or Jenkins can broker trust for Vault by providing role IDs and wrapped secret IDs for the "build job" to consume. The pattern is described in the documentation linked above.

Create AppRoles

Vault must first be configured with the required AppRoles.
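As a minimal sketch of that configuration (the role name, policy name, and TTLs are assumptions for illustration), enabling the AppRole auth method and creating a role for the build job could look like:

```shell
# Enable the AppRole auth method at the default path "approle"
vault auth enable approle

# Create a hypothetical AppRole for the build job with a short-lived,
# single-use secret ID, as recommended for CI/CD brokering
vault write auth/approle/role/build-job \
    token_policies="build-job" \
    token_ttl=10m \
    secret_id_ttl=10m \
    secret_id_num_uses=1
```

The single-use, short-TTL secret ID limits the blast radius if a wrapped secret ID leaks in the pipeline.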

Create a policy for the CI Controller:
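A sketch of such a policy, assuming the AppRole lives at the default `auth/approle` path (the policy name and the use of a wildcard role segment are assumptions):

```shell
# Hypothetical policy letting the CI controller read role IDs and
# generate (wrapped) secret IDs for AppRoles, but nothing else
vault policy write ci-controller - <<EOF
path "auth/approle/role/+/role-id" {
  capabilities = ["read"]
}
path "auth/approle/role/+/secret-id" {
  capabilities = ["create", "update"]
}
EOF
```

The controller never holds a usable credential pair itself: it reads the role ID and requests a response-wrapped secret ID, and only the build job unwraps the latter.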