Demo: CI/CD with Vault AppRole

Recommended Pattern for Vault AppRole Use

This demo shows how a CI/CD tool such as GitLab or Jenkins can broker trust for Vault by providing role IDs and wrapped secret IDs for the "build job" to consume. You can find the described pattern in the documentation.

Create AppRoles

Vault needs to be configured to create the AppRoles needed.

Create a policy for the CI Controller:
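As a sketch, a controller policy along these lines would let the CI controller read role IDs and generate (wrapped) secret IDs. The AppRole name `myapp` and the mount path are assumptions, not taken from this demo:

```hcl
# Hypothetical CI-controller policy (role name "myapp" is an assumption).
# Read the role ID of the AppRole:
path "auth/approle/role/myapp/role-id" {
  capabilities = ["read"]
}

# Generate secret IDs (typically requested with -wrap-ttl):
path "auth/approle/role/myapp/secret-id" {
  capabilities = ["create", "update"]
}
```

The build job itself only ever receives the wrapped secret ID, so the controller never holds a usable login pair.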

Demo: CI/CD with TFC + Vault AppRole

Recommended Pattern for Vault AppRole Use | Vault - HashiCorp Learn
Terraform Cloud Secrets Engine | Vault - HashiCorp Learn
https://registry.terraform.io/providers/hashicorp/vault/latest/docs/data-sources/generic_secret

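Following the generic_secret data source linked above, reading a KV secret from Terraform might look like this minimal sketch (the secret path and output name are assumptions):

```hcl
data "vault_generic_secret" "demo" {
  # Assumed KV v2 path; adjust to your mount and secret name.
  path = "secret/data/demo"
}

output "demo_value" {
  value     = data.vault_generic_secret.demo.data
  sensitive = true
}
```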

from operator import add

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

# Word count over a Kafka topic using the Spark Streaming direct approach
sc = SparkContext(appName="StreamingExampleWithKafka")
ssc = StreamingContext(sc, 10)  # 10-second micro-batches
opts = {"metadata.broker.list": "broker.kafka.l4lb.thisdcos.directory:9092"}
kvs = KafkaUtils.createDirectStream(ssc, ["mytopic"], opts)
lines = kvs.map(lambda x: x[1])  # keep the message value, drop the Kafka key
counts = lines.flatMap(lambda line: line.split(" ")) \
    .map(lambda word: (word, 1)) \
    .reduceByKey(add)
counts.pprint()
ssc.start()
ssc.awaitTermination()
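The flatMap/map chain above turns each line into (word, 1) pairs for a streaming word count; the same per-batch logic can be sketched in plain Python:

```python
from collections import Counter

def word_count(lines):
    # Split each line on spaces and tally the words, mirroring the
    # flatMap -> map -> reduceByKey pipeline above.
    counts = Counter()
    for line in lines:
        counts.update(line.split(" "))
    return dict(counts)

print(word_count(["hello world", "hello dcos"]))
```

In the streaming job, Spark applies the equivalent reduction to every 10-second micro-batch rather than to a fixed list.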

Install DC/OS Packages: Elastic + Kibana

Tested with: DC/OS Enterprise 1.12.3 - Strict Mode

  • Install the DC/OS Enterprise CLI
dcos package install dcos-enterprise-cli --cli --yes
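With the Enterprise CLI in place, the packages themselves can then be installed. As a sketch with default options (package names as published in the DC/OS Universe catalog):

```shell
dcos package install elastic --yes
dcos package install kibana --yes
```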
from operator import add

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

# Same word count, pointed at a single broker's autoip address
sc = SparkContext(appName="StreamingExampleWithKafka")
ssc = StreamingContext(sc, 10)  # 10-second micro-batches
opts = {"metadata.broker.list": "kafka-0-broker.kafka.autoip.dcos.thisdcos.directory:1025"}
kvs = KafkaUtils.createDirectStream(ssc, ["mytopic"], opts)
lines = kvs.map(lambda x: x[1])  # keep the message value, drop the Kafka key
counts = lines.flatMap(lambda line: line.split(" ")) \
    .map(lambda word: (word, 1)) \
    .reduceByKey(add)
counts.pprint()
ssc.start()
ssc.awaitTermination()
{
  "id": "pgdump",
  "run": {
    "cpus": 0.5,
    "mem": 512,
    "disk": 0,
    "cmd": "pg_dump --dbname=$PGDB --host=$PGHOST --port=$PGPORT --username=$PGUSER > $BACKUP_FOLDER/$PGHOST-$PGDB-`date +\\%Y-\\%m-\\%d-\\%H-\\%M-\\%S`.sql && gzip $BACKUP_FOLDER/*",
    "docker": {
      "image": "postgres:9.6"
    }
  }
}
{
  "id": "/postgresql",
  "backoffFactor": 1.15,
  "backoffSeconds": 1,
  "container": {
    "portMappings": [
      {
        "containerPort": 5432,
        "hostPort": 0,
        "labels": {}
      }
    ]
  }
}
jrx / kong.json (Created September 26, 2017 06:27)
Kong with Host Ports
{
  "id": "/kong",
  "acceptedResourceRoles": [
    "slave_public"
  ],
  "backoffFactor": 1.15,
  "backoffSeconds": 1,
  "cmd": "export KONG_NGINX_DAEMON=\"off\" && kong migrations up && kong prepare -p /usr/local/kong/ && /usr/local/openresty/nginx/sbin/nginx -c /usr/local/kong/nginx.conf -p /usr/local/kong/",
  "container": {
    "portMappings": []
  }
}
We the People of the United States, in Order to form a more perfect Union,
establish Justice, insure domestic Tranquility, provide for the common
defence, promote the general Welfare, and secure the Blessings of Liberty to
ourselves and our Posterity, do ordain and establish this Constitution for the
United States of America.
Article 1.
Section 1
All legislative Powers herein granted shall be vested in a Congress of the
United States, which shall consist of a Senate and House of Representatives.