Gary A. Stafford (garystafford)

View sam-local-invoke.sh
# variables (required by local lambda functions)
TABLE_NAME=your-dynamodb-table-name

# local testing (all CRUD functions)
sam local invoke PostMessageFunction \
  --event lambda_apigtw_to_dynamodb/events/event_postMessage.json
sam local invoke GetMessageFunction \
  --event lambda_apigtw_to_dynamodb/events/event_getMessage.json
sam local invoke GetMessagesFunction \
  --event lambda_apigtw_to_dynamodb/events/event_getMessages.json
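If TABLE_NAME is not already defined in the template's Environment section, the shell variable above is not automatically visible inside the sam local invoke container; one way to supply it is an --env-vars file. A minimal sketch, assuming the function logical IDs match template.yaml:

# env.json maps each function's logical ID to its environment variables
cat > env.json <<'EOF'
{
  "PostMessageFunction": { "TABLE_NAME": "your-dynamodb-table-name" },
  "GetMessageFunction": { "TABLE_NAME": "your-dynamodb-table-name" },
  "GetMessagesFunction": { "TABLE_NAME": "your-dynamodb-table-name" }
}
EOF

sam local invoke GetMessageFunction \
  --env-vars env.json \
  --event lambda_apigtw_to_dynamodb/events/event_getMessage.json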
View event_getMessage.json
{
  "body": "",
  "resource": "/",
  "path": "/message",
  "httpMethod": "GET",
  "isBase64Encoded": false,
  "queryStringParameters": {
    "time": "06:45:43"
  },
  "pathParameters": {
    "date": "2001-01-01"
  }
}
View parameter.yml
Parameters:
  DataBucketName:
    Type: String
    Description: S3 bucket where CSV files are placed
    Default: your-data-bucket-name-here
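The default can be overridden at deploy time rather than edited in the template; a sketch using the AWS CLI, in which packaged.yaml, the stack name, and the bucket name are placeholders:

aws cloudformation deploy \
  --template-file packaged.yaml \
  --stack-name your_cloudformation_stack_name \
  --parameter-overrides DataBucketName=your-data-bucket-name-here \
  --capabilities CAPABILITY_IAM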
View build-package-deploy.sh
# variables
S3_BUILD_BUCKET=your_build_bucket_name
STACK_NAME=your_cloudformation_stack_name

# validate
sam validate --template template.yaml
aws cloudformation validate-template \
  --template-body file://template.yaml
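The snippet above covers only the validation step; a minimal sketch of the package and deploy steps implied by the script's name, reusing the variables defined above (the exact flags may differ from the author's script):

# package: upload build artifacts to the S3 build bucket and emit a rewritten template
sam package \
  --template-file template.yaml \
  --output-template-file packaged.yaml \
  --s3-bucket ${S3_BUILD_BUCKET}

# deploy: create or update the CloudFormation stack
sam deploy \
  --template-file packaged.yaml \
  --stack-name ${STACK_NAME} \
  --capabilities CAPABILITY_IAM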
View app.js
// getMessage: look up a single item by date (path parameter) and time (query string parameter)
let tableName;

exports.getMessage = async (event, context) => {
  if (tableName == null) {
    tableName = process.env.TABLE_NAME;
  }
  const params = {
    TableName: tableName,
    Key: {
      "date": event.pathParameters.date,
      "time": event.queryStringParameters.time
    }
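Against a deployed API, the handler above would be reached with a GET carrying the date in the path and the time in the query string; a sketch in which the endpoint, stage, and route shape are assumptions, not taken from the template:

curl "https://<api-id>.execute-api.<region>.amazonaws.com/Prod/message/2001-01-01?time=09:01:05"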
View app.py
def lambda_handler(event, context):
    # map each message's operation name to the corresponding DynamoDB call
    operations = {
        'DELETE': lambda dynamo, x: dynamo.delete_item(**x),
        'POST': lambda dynamo, x: dynamo.put_item(**x),
        'PUT': lambda dynamo, x: dynamo.update_item(**x),
        'GET': lambda dynamo, x: dynamo.get_item(**x),
        'GET_ALL': lambda dynamo, x: dynamo.scan(**x),
    }

    for record in event['Records']:
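Because the handler loops over event['Records'], it can be exercised locally with a records-style test event; a sketch in which an SQS trigger is assumed and the function logical ID is a placeholder (it is not shown in the gist):

sam local generate-event sqs receive-message > events/event_sqs.json
# edit the record body to carry the DynamoDB operation and parameters, then:
sam local invoke YourSqsHandlerFunction --event events/event_sqs.json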
View app.py
import urllib.parse


def lambda_handler(event, context):
    # read the CSV object named in the S3 event, then process its messages
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(
        event['Records'][0]['s3']['object']['key'],
        encoding='utf-8'
    )
    messages = read_csv_file(bucket, key)
    process_messages(messages)
View sqs_message.json
{
  "TableName": "your-dynamodb-table-name",
  "Item": {
    "date": {
      "S": "2001-01-01"
    },
    "time": {
      "S": "09:01:05"
    },
    "location": {
View message_data.csv
timestamp,location,source,local_dest,local_avg,remote_dest,remote_avg
1559040909.3853335,location-03,wireless,router-1,4.39,device-1,9.09
1559040919.5273902,location-03,wireless,router-1,0.49,device-1,16.75
1559040929.6446512,location-03,wireless,router-1,0.56,device-1,8.31
1559040939.7712135,location-03,wireless,router-1,1.64,device-1,9.4
1559040949.891723,location-03,wireless,router-1,1.18,device-1,9.07
1559040960.011338,location-03,wireless,router-1,0.42,device-1,8.4
1559040970.1319716,location-03,wireless,router-1,1.73,device-1,8.66
1559040980.2533505,location-03,wireless,router-1,0.67,device-1,8.61
1559040990.3816211,location-03,wireless,router-1,1.27,device-1,10.87
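Copying a file like this into the data bucket is what triggers the S3-to-Lambda flow shown earlier; a sketch using the default bucket name from parameter.yml:

aws s3 cp message_data.csv s3://your-data-bucket-name-here/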
View gimp_scale_batch.py
#!/usr/bin/python
# optimized for processing screen grabs
# copy and paste into the GIMP Python-Fu Console (Filters > Python-Fu > Console)
# open all images to be processed in GIMP first
# script scales, sharpens, and exports all open images as PNGs
import importlib