@picadoh
picadoh / cloudwatch.tf
Last active March 20, 2024 12:20
EC2 Instance Scheduling (Stop/Start) with Terraform
### Cloudwatch Events ###
# Event rule: Runs at 8am (UTC) during working days
resource "aws_cloudwatch_event_rule" "start_instances_event_rule" {
  name                = "start_instances_event_rule"
  description         = "Starts stopped EC2 instances"
  schedule_expression = "cron(0 8 ? * MON-FRI *)"
  depends_on          = ["aws_lambda_function.ec2_start_scheduler_lambda"]
}
# Runs at 8pm (UTC) during working days
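The rule above only supplies the schedule; the Lambda it depends on is not shown in this preview. As a rough sketch only, assuming instances are selected by a hypothetical Scheduled=true tag, a start handler in Python with boto3 could look like this:
# Hypothetical start handler, not taken from the gist; selects instances by an assumed Scheduled=true tag.
import boto3

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    # Find stopped instances that carry the scheduling tag
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:Scheduled", "Values": ["true"]},
            {"Name": "instance-state-name", "Values": ["stopped"]},
        ]
    )["Reservations"]
    instance_ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if instance_ids:
        ec2.start_instances(InstanceIds=instance_ids)
    return {"started": instance_ids}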
@picadoh
picadoh / docker-compose.yml
Last active September 7, 2019 09:57
Kafka Docker-Compose
version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper:3.4.6
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:0.11.0.1
    ports:
      - "9092:9092"
@picadoh
picadoh / KafkaConsumer.csproj
Last active November 22, 2017 21:16
Kafka Consumer for .NET Core
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Confluent.Kafka" Version="0.11.2" />
  </ItemGroup>
</Project>
@picadoh
picadoh / KafkaProducer.csproj
Last active November 22, 2017 21:26
Kafka Producer for .NET Core
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Confluent.Kafka" Version="0.11.2" />
  </ItemGroup>
</Project>
@picadoh
picadoh / event_app.sh
Last active May 9, 2017 11:52
KeyValueStores
#!/bin/bash
# storage functions
db_set () {
  # append "key,value" to the log; old values are never overwritten
  echo "$1,$2" >> event_store
}
db_get () {
  # return the most recent value written for the key
  grep "^$1," event_store | sed -e "s/^$1,//" | tail -n 1
}
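For comparison only, the same append-only, last-write-wins idea sketched in Python (the function and file names mirror the script above; nothing here is from the gist):
# Append-only key-value store mirroring db_set/db_get above.
def db_set(key, value, path="event_store"):
    with open(path, "a") as f:
        f.write(f"{key},{value}\n")  # append one "key,value" line per write

def db_get(key, path="event_store"):
    latest = None
    with open(path) as f:
        for line in f:
            k, _, v = line.rstrip("\n").partition(",")
            if k == key:
                latest = v  # later lines win, like tail -n 1
    return latest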

Keybase proof

I hereby claim:

  • I am picadoh on github.
  • I am picadoh (https://keybase.io/picadoh) on keybase.
  • I have a public key whose fingerprint is A00B 4282 CCED A0A3 0809 90F4 7114 7A08 A373 B191

To claim this, I am signing this object:

@picadoh
picadoh / _secure_kafka_cluster_commands
Last active June 4, 2022 22:22
Secure Kafka Cluster
# env
export KAFKA_HOST="my.kafka.hostname"
export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_server_jaas.conf"
# create topics
kafka-topics --create --topic securing-kafka --replication-factor 1 --partitions 3 --zookeeper $KAFKA_HOST:2181
# producer acl
kafka-acls --authorizer-properties zookeeper.connect=$KAFKA_HOST:2181 --add --allow-principal User:kafkaclient --producer --topic securing-kafka
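A client talking to a cluster locked down like this needs matching security settings. A hedged Python sketch with confluent-kafka follows; the protocol, port, principal and file paths are assumptions, not values from the gist:
from confluent_kafka import Producer

# Assumed settings: the JAAS config above hints at SASL (Kerberos), but the
# exact listener port, mechanism and CA path below are placeholders.
conf = {
    "bootstrap.servers": "my.kafka.hostname:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "GSSAPI",
    "sasl.kerberos.service.name": "kafka",
    "ssl.ca.location": "/etc/kafka/ssl/ca-cert.pem",
}

producer = Producer(conf)
producer.produce("securing-kafka", value=b"hello from an authenticated client")
producer.flush()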
@picadoh
picadoh / WordCounter.java
Last active January 2, 2021 00:45
Kafka Streams Example
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStreamBuilder;
import org.apache.kafka.streams.processor.WallclockTimestampExtractor;
import java.util.Properties;
import static java.util.Arrays.asList;
@picadoh
picadoh / StreamWordCounter.py
Created September 17, 2016 16:28
Spark Stateful Streaming with Python - Example that takes text from an input network socket and prints the accumulated count for each word
# Spark Stateful Streaming with Python
# Takes text from an input network socket and prints the accumulated count for each word
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
# define the update function
def updateTotalCount(currentCount, countState):
    if countState is None:
        countState = 0