Maik Fleuter (MrMikeFloyd)
🦄 living, learning, coding.
Munich, DE
MrMikeFloyd / taskdef-prod.json
Created May 23, 2022 07:55
ECS task definition for automated CodeDeploy deployment. The execution role ARN needs to be replaced before checking this file into version control for the deployment to succeed. Find the most recent version here: https://github.com/codecentric/accelerate-kickstarter-aws/blob/main/cloud-bootstrap-app/taskdef-prod.json.template
{
  "family": "cloud-bootstrap",
  "executionRoleArn": "${TASK_EXEC_ROLE_ARN}",
  "networkMode": "awsvpc",
  "cpu": "1024",
  "memory": "2048",
  "requiresCompatibilities": [
    "FARGATE"
  ],
  "containerDefinitions": [
MrMikeFloyd / appspec-prod.yml
Created May 23, 2022 07:51
CodeDeploy application specification for automated ECS deployments. The task definition will be injected during deployment. Find the most recent version here: https://github.com/codecentric/accelerate-kickstarter-aws/blob/main/cloud-bootstrap-app/appspec-prod.yaml
version: 0.0
Resources:
  - TargetService:
      Type: AWS::ECS::Service
      Properties:
        TaskDefinition: <TASK_DEFINITION>
        LoadBalancerInfo:
          ContainerName: "cloud-bootstrap"
          ContainerPort: 8080
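Once the pipeline has injected the task definition in place of <TASK_DEFINITION>, a deployment can be started by handing this appspec to CodeDeploy. A hedged boto3 sketch of that step; the application and deployment group names are hypothetical and not taken from the gist:

import boto3

codedeploy = boto3.client("codedeploy")

# The appspec with <TASK_DEFINITION> already replaced by the pipeline.
with open("appspec-prod.yml") as f:
    appspec = f.read()

response = codedeploy.create_deployment(
    applicationName="cloud-bootstrap",           # hypothetical name
    deploymentGroupName="cloud-bootstrap-prod",  # hypothetical name
    revision={
        "revisionType": "AppSpecContent",
        "appSpecContent": {"content": appspec},
    },
)
print(response["deploymentId"])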
MrMikeFloyd / buildspec.yml
Created May 23, 2022 07:19
AWS CodeBuild build specification file to be included in the repo of the artifact we'd like built. Find the most recent version here: https://github.com/codecentric/accelerate-kickstarter-aws/blob/main/cloud-bootstrap-app/buildspec.yml
version: 0.2
phases:
  install:
    # buildspec v0.2 only accepts runtime-versions under phases.install
    runtime-versions:
      java: openjdk8
      docker: 18
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
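The pre_build phase logs Docker in to Amazon ECR before images are built and pushed. As an illustration of what that login involves (the buildspec itself typically does this with the AWS CLI), a hedged boto3 sketch:

import base64
import boto3

# Fetch a temporary ECR auth token; the decoded token has the form "AWS:<password>".
ecr = boto3.client("ecr")
auth = ecr.get_authorization_token()["authorizationData"][0]
username, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
registry = auth["proxyEndpoint"]

# The password would then be piped to `docker login --username AWS --password-stdin <registry>`.
print(f"Log in to {registry} as {username}")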
spring:
  application:
    name: kafka-telemetry-data-consumer
  # Ignore type headers in kafka message
  kafka.properties.spring.json.use.type.headers: false
  cloud:
    stream:
      kafka:
        binder:
          brokers: "localhost:29092"
MrMikeFloyd / KafkaStreamsHandler.kt
Created December 15, 2021 12:40
Function declaration that contains all the logic to split and aggregate the probe telemetry data. Find the most recent version here: https://github.com/codecentric/spring-kafka-streams-example/blob/main/kafka-samples-streams/src/main/kotlin/com/example/kafkasamplesstreams/KafkaStreamsHandler.kt
package com.example.kafkasamplesstreams
import com.example.kafkasamplesstreams.events.AggregatedTelemetryData
import com.example.kafkasamplesstreams.events.SpaceAgency
import com.example.kafkasamplesstreams.events.TelemetryDataPoint
import com.example.kafkasamplesstreams.serdes.AggregateTelemetryDataSerde
import mu.KotlinLogging
import org.apache.kafka.common.serialization.Serdes
import org.apache.kafka.streams.kstream.KStream
import org.apache.kafka.streams.kstream.Materialized
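The handler splits the incoming probe telemetry stream per space agency and keeps running aggregates per key. Purely as an illustration of that idea (not the Kafka Streams topology itself), a small Python sketch with hypothetical field names:

from collections import defaultdict

# Group telemetry data points by space agency and keep running aggregates.
# Field names (agency, traveled_distance, current_speed) are hypothetical.
def aggregate_telemetry(points):
    totals = defaultdict(lambda: {"traveled_distance": 0.0, "max_speed": 0.0})
    for point in points:
        agg = totals[point["agency"]]
        agg["traveled_distance"] += point["traveled_distance"]
        agg["max_speed"] = max(agg["max_speed"], point["current_speed"])
    return dict(totals)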
spring:
  kafka.properties.spring.json.use.type.headers: false
  application:
    name: kafka-telemetry-data-aggregator
  cloud:
    function:
      definition: aggregateTelemetryData
    stream:
      bindings:
        aggregateTelemetryData-in-0:
package de.codecentric.samples.kafkasamplesproducer
import de.codecentric.samples.kafkasamplesproducer.event.TelemetryData
import mu.KotlinLogging
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.cloud.stream.function.StreamBridge
import org.springframework.kafka.support.KafkaHeaders
import org.springframework.messaging.support.MessageBuilder
import org.springframework.stereotype.Component
spring:
  application:
    name: kafka-telemetry-data-producer
  cloud:
    stream:
      kafka:
        binder:
          brokers: "localhost:29092"
      bindings:
        telemetry-data-out-0:
MrMikeFloyd / plot-response-performance.py
Created June 30, 2021 07:13
Python script that reads a response time log file and plots its records.
import matplotlib.pyplot as plt

def read_file():
    with open("response-times-records.txt") as f:
        # This assumes that lines are formatted like so:
        # 42.123456|2021-07-01 12:00:00.000000
        return [line.rstrip().split("|") for line in f]

def plot_response_times():
    records = read_file()
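The preview cuts off here; a hedged sketch of how the plotting could continue, reusing read_file from above and the record format documented in its comment:

import matplotlib.pyplot as plt

def plot_response_times():
    records = read_file()
    # Each record is [response_time, timestamp]; plot the response times in request order.
    response_times = [float(value) for value, _timestamp in records]
    plt.plot(response_times)
    plt.xlabel("Request #")
    plt.ylabel("Response time")
    plt.show()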
MrMikeFloyd / graphql-to-s3-lambda.py
Last active June 29, 2021 13:07
Sample AWS Lambda function (Python) that authenticates against a GraphQL database, prints the query result, and stores the query duration in an S3 bucket. The credentials are taken from the SSM Parameter Store. To make external libraries such as requests available at execution time, they need to be packaged beforehand (using virtualenv and the like).
import json
import urllib.parse
import boto3
import requests
import time
import datetime
ALBUM_ID = "REPLACE_ME_ALBUM_ID"
S3_BUCKET = "REPLACE_ME_BUCKET_NAME"
S3_KEY = "REPLACE_ME_BUCKET_OBJECT_NAME"
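The preview stops at the configuration constants; a hedged sketch of the handler flow the description outlines, reusing the imports and constants above: read credentials from SSM Parameter Store, time a GraphQL request, print the result, and store the duration in the S3 bucket. The parameter name, endpoint, and query shape are hypothetical placeholders.

GRAPHQL_ENDPOINT = "REPLACE_ME_GRAPHQL_URL"  # hypothetical, analogous to the constants above

def lambda_handler(event, context):
    # Credentials come from SSM Parameter Store (parameter name is hypothetical).
    ssm = boto3.client("ssm")
    token = ssm.get_parameter(Name="/graphql/api-token", WithDecryption=True)["Parameter"]["Value"]

    # Time the GraphQL query and print the result.
    query = '{ album(id: "%s") { id } }' % ALBUM_ID  # hypothetical query shape
    start = time.time()
    response = requests.post(
        GRAPHQL_ENDPOINT,
        json={"query": query},
        headers={"Authorization": f"Bearer {token}"},
    )
    duration = time.time() - start
    print(json.dumps(response.json()))

    # Store the measured duration in the S3 bucket configured above.
    record = f"{datetime.datetime.utcnow().isoformat()}|{duration:.6f}\n"
    boto3.client("s3").put_object(Bucket=S3_BUCKET, Key=S3_KEY, Body=record.encode("utf-8"))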