Serverless London Meetup
2022-07-20
Sarah, LEGO Group, Kinesis
Real-time data processing using Kinesis and Lambda
Intro: I'm Sarah (Twitter: serverlesssarah), and I work at the LEGO Group.
This is a script that takes a CSV file of email addresses from a dirty data source and splits it into two CSVs: one containing the valid email addresses, the other the invalid ones.
First, clone the alphagov/notifications-utils repository. Add this script to the root, and make it executable:
chmod +x validate-emails.py
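The script itself isn't reproduced here, but a minimal sketch of the idea follows. It assumes notifications-utils exposes a validate_email_address helper that raises InvalidEmailError (the notifications_utils.recipients import path, the emails-in-the-first-column layout, and the valid.csv / invalid.csv output filenames are all assumptions for illustration, not taken from the original):

#!/usr/bin/env python3
# validate-emails.py -- illustrative sketch only, not the original script.
# Assumes notifications-utils provides validate_email_address / InvalidEmailError;
# the exact module path may differ between versions of the repo.
import csv
import sys

from notifications_utils.recipients import InvalidEmailError, validate_email_address


def main(input_path):
    # Read email addresses from the first column of the input CSV and write
    # each one to valid.csv or invalid.csv depending on whether it validates.
    with open(input_path, newline="") as source, \
            open("valid.csv", "w", newline="") as valid_file, \
            open("invalid.csv", "w", newline="") as invalid_file:
        valid_writer = csv.writer(valid_file)
        invalid_writer = csv.writer(invalid_file)
        for row in csv.reader(source):
            if not row:
                continue
            email = row[0].strip()
            try:
                validate_email_address(email)
                valid_writer.writerow([email])
            except InvalidEmailError:
                invalid_writer.writerow([email])


if __name__ == "__main__":
    main(sys.argv[1])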
Call it like this:
dhall text <<< './sqs-broker-template.dhall "myQueueName" (toMap {tag1 = "foo", tag2 = "bar"}) False'
(change False to True for a FIFO queue)
---
apiVersion: config.istio.io/v1alpha2
kind: metric
metadata:
  labels:
    app: canary
    owner: canary-hack
  name: requestcountbypath
  namespace: sandbox-main
spec:
package main

import (
	"fmt"
	"log"

	"github.com/ugorji/go/codec"
)

func main() {
scrape_configs:
  - job_name: 'paas'
    scheme: https
    static_configs:
      - targets: ['guid:0','guid:1']
        labels:
          job: foo
          space: my-space
require 'octokit'

client = Octokit::Client.new(access_token: "<TOKEN>")
repos = client.repos('gds-attic')
repos.take(10).each do |repo|
  puts "archiving #{repo.url}..."
  client.post(
    "#{repo.url}/transfer",