@ppshein
ppshein / aws-s3-multer-nodejs.js
Last active May 15, 2019 11:19
Upload files to AWS S3 with Node.js
var aws = require('aws-sdk')
var express = require('express')
var multer = require('multer')
var multerS3 = require('multer-s3')

var app = express()
var s3 = new aws.S3({
  accessKeyId: '',
  secretAccessKey: '',
  region: 'ap-southeast-1'
})
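The snippet above creates the S3 client but the gist is cut off before the multer wiring. In multer-s3, the `key` option receives a callback that names each uploaded object; below is a small, testable naming helper plus the wiring it would plug into. The helper name, bucket name, and timestamp scheme are our assumptions for illustration, not part of the gist.

```javascript
// Builds an S3 object key from the form field name plus a timestamp,
// so repeated uploads of the same field don't overwrite each other.
// `buildObjectKey` is a hypothetical helper, not from the original gist.
function buildObjectKey(fieldname, now) {
  return fieldname + '-' + now.getTime();
}

// Hypothetical wiring (assumes a bucket named 'my-example-bucket'):
//
//   var upload = multer({
//     storage: multerS3({
//       s3: s3,
//       bucket: 'my-example-bucket',
//       key: function (req, file, cb) {
//         cb(null, buildObjectKey(file.fieldname, new Date()));
//       }
//     })
//   });
//
//   app.post('/upload', upload.single('photo'), function (req, res) {
//     res.json({ location: req.file.location });
//   });
```

With this storage engine, multer streams the file directly to S3 instead of buffering it on the server's disk, and `req.file.location` holds the resulting object URL.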
@ppshein
ppshein / Dockerfile
Last active June 6, 2019 07:37
SailsJS Docker Image
# Instructions from the app developer
# - you should use the 'node' official image, with the alpine 6.x branch
FROM node:6-alpine
# - this app listens on port 1337 (the Sails default); publish it with
#   'docker run -p 80:1337' so it responds on http://localhost:80 on your computer
EXPOSE 1337
# - use the alpine package manager to install tini: 'apk add --update tini'
RUN apk add --update tini
# - create directory /usr/src/app for app files with 'mkdir -p /usr/src/app'
RUN mkdir -p /usr/src/app
@ppshein
ppshein / setup-awscli-codedeploy-agent.sh
Created June 6, 2019 09:50 — forked from craigvantonder/setup-awscli-codedeploy-agent.sh
How to install the AWS CodeDeploy agent on Ubuntu 14.04 / 16.04 / 18.04
#!/bin/bash
# AWS CLI
apt-get install zip -y;
curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip";
unzip awscli-bundle.zip;
./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws;
aws configure;
# AWS Access Key ID [None]: obtained when creating the user in AWS IAM
@ppshein
ppshein / aws-ship-it-stack.yml
Created June 10, 2019 14:34 — forked from TheDeveloper/aws-ship-it-stack.yml
App deploy stack with ECS, CodeBuild & CodePipeline.
# App ship-it stack with ECS, CodeBuild & CodePipeline.
#
# aws cloudformation deploy \
# --stack-name myapp-prod \
# --template-file ./aws-ship-it-stack.yml \
# --parameter-overrides \
# KeyName=<KEY_NAME> \
# GitHubAuthToken=<ACCESS_TOKEN> \
# RepoOwner=<OWNER_NAME> \
# RepoName=<REPO_NAME> \
"use strict";

const MongoClient = require('mongodb').MongoClient;
const MONGODB_URI = process.env.MONGODB_URI;

let cachedDb = null;

function connectToDatabase(uri) {
  if (cachedDb) {
    // Warm Lambda invocation: reuse the connection from a previous call.
    return Promise.resolve(cachedDb);
  }
  return MongoClient.connect(uri).then(function (db) {
    cachedDb = db;
    return cachedDb;
  });
}

var AWS = require("aws-sdk");
var lambda = new AWS.Lambda({
  region: 'ap-southeast-1'
});

function mongodbMonitorResult() {
  return new Promise(function (resolve, reject) {
    lambda.invoke({
      FunctionName: process.env.MongoMonitorLambdaARN,
      InvocationType: 'RequestResponse'
    }, function (err, data) {
      if (err) { reject(err); } else { resolve(data); }
    });
  });
}
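The point of the module-level `cachedDb` above is that AWS freezes and thaws Lambda containers between invocations, so a connection opened on a cold start can be reused on warm ones. The pattern can be isolated with an injected connector so it is exercisable without a real MongoDB; `makeConnectToDatabase` is our own hypothetical name for this sketch.

```javascript
// Connection-caching pattern: the connector runs at most once per
// container; later calls resolve to the cached handle.
function makeConnectToDatabase(connect) {
  var cachedDb = null;
  return function connectToDatabase(uri) {
    if (cachedDb) {
      // Warm invocation: skip the connect round-trip entirely.
      return Promise.resolve(cachedDb);
    }
    return connect(uri).then(function (db) {
      cachedDb = db;
      return cachedDb;
    });
  };
}
```

A handler built this way pays the connection cost only on cold starts, which matters because opening a MongoDB connection can take longer than the query itself.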
@ppshein
ppshein / main.tf
Created October 21, 2019 01:23 — forked from carlochess/main.tf
aws batch terraform example
## Make sure your Subnet has internet access
variable "subnet" {}
variable "vpc" {}
provider "aws" {
region = "us-east-1"
}
data "aws_vpc" "sample" {
  id = "${var.vpc}"
}
@ppshein
ppshein / multipart.js
Created October 27, 2019 03:16 — forked from magegu/multipart.js
Multipart upload for AWS S3 with Node.js, based on the async lib, including retries for part uploads
/*
by Martin Güther @magegu
just call it:
uploadFile(absoluteFilePath, callback);
*/
var path = require('path');
var async = require('async');
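The gist is truncated here, but the core of any S3 multipart upload is slicing the file into fixed-size parts (S3 requires every part except the last to be at least 5 MB) and uploading each slice with a 1-based part number. A small helper for computing those byte ranges; the name `partRanges` is ours, not from the gist.

```javascript
// Computes the [start, end) byte ranges for an S3 multipart upload.
// Each range maps to one UploadPart call; part numbers are 1-based.
function partRanges(fileSize, partSize) {
  var ranges = [];
  for (var start = 0; start < fileSize; start += partSize) {
    ranges.push({
      partNumber: ranges.length + 1,
      start: start,
      end: Math.min(start + partSize, fileSize) // last part may be shorter
    });
  }
  return ranges;
}
```

Each range would then be read from the file and passed to `s3.uploadPart`, with the returned ETags collected for the final `completeMultipartUpload` call; retrying an individual failed part (as this gist does via the async lib) is the main advantage over a single PutObject.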
@ppshein
ppshein / elasticsearchbulkimport.js
Created November 13, 2019 04:05 — forked from nicholasblexrud/elasticsearchbulkimport.js
Bulk upload files using Node.js to Elasticsearch
// credit goes to this stack overflow post - http://stackoverflow.com/questions/20646836/is-there-any-way-to-import-a-json-filecontains-100-documents-in-elasticsearch
var elasticsearch = require('elasticsearch'),
    fs = require('fs'),
    pubs = JSON.parse(fs.readFileSync(__dirname + '/pubs.json')),   // name of my first file to parse
    forms = JSON.parse(fs.readFileSync(__dirname + '/forms.json')); // and the second set

var client = new elasticsearch.Client({ // default is fine for me, change as you see fit
  host: 'localhost:9200',
  log: 'trace'
});
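The Elasticsearch bulk API takes an interleaved body: one action line, then the document itself, for every document. A small builder for that body (our own helper for illustration, not part of the gist):

```javascript
// Converts an array of documents into the interleaved action/document
// array that the Elasticsearch bulk API expects.
function toBulkBody(indexName, typeName, docs) {
  var body = [];
  docs.forEach(function (doc) {
    body.push({ index: { _index: indexName, _type: typeName } });
    body.push(doc);
  });
  return body;
}

// Hypothetical usage with the client above:
//   client.bulk({ body: toBulkBody('pubs', 'doc', pubs) }, function (err, resp) { ... });
```

Sending documents in one bulk request instead of one index call each is what makes importing a file of hundreds of documents practical.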
@ppshein
ppshein / AppSync-Example.yaml
Created January 17, 2020 09:12 — forked from adrianhall/AppSync-Example.yaml
An example CloudFormation template for AWS AppSync
---
Description: AWSAppSync DynamoDB Example
Resources:
  GraphQLApi:
    Type: "AWS::AppSync::GraphQLApi"
    Properties:
      Name: AWSAppSync DynamoDB Example
      AuthenticationType: AWS_IAM
  PostDynamoDBTableDataSource: