David Guttman davidguttman

dimkir / nightmare-on-amazon-linux.MD
Last active Oct 2, 2019
How to run nightmare on Amazon Linux

Running nightmare on Amazon Linux

You may have thought of running nightmare on AWS Lambda. But before we can run it on Lambda, we first need to make it run on Amazon Linux.

Provision an instance that replicates the Lambda environment

According to the AWS documentation on the Lambda execution environment and available libraries, we need the AMI image with the alias amzn-ami-hvm-2016.03.3.x86_64-gp2. Keep in mind that the AMI image id for this alias differs between regions, e.g.:

  • In eu-west-1 - ami-f9dd458a
  • In us-east-1 - ami-6869aa05
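Since the id is region-specific, it helps to resolve it before launching. A minimal sketch, assuming only the two regions listed above; the `TARGET_REGION` variable, instance type, key name, and the commented `aws` invocation are placeholders, not from the gist:

```shell
#!/bin/sh
# Pick the Lambda-matching AMI id for the target region
# (ids are the ones listed above; other regions need their own lookup).
region="${TARGET_REGION:-eu-west-1}"
case "$region" in
  eu-west-1) ami="ami-f9dd458a" ;;
  us-east-1) ami="ami-6869aa05" ;;
  *) echo "no known AMI for region $region" >&2; exit 1 ;;
esac
echo "would launch $ami in $region"
# Hypothetical launch command -- adjust type/key/security group to taste:
# aws ec2 run-instances --region "$region" --image-id "$ami" \
#   --instance-type t2.micro --key-name my-key
```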
dotproto / unicode_string_comparison.js
Last active Jan 20, 2017
Examining raw Unicode values and their normalized forms. TL;DR: compare Unicode strings using `.normalize()` and `.localeCompare()`
// References
// - (What is the difference between W3C normalization and Unicode normalization?)
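The gist's point in one runnable sketch: two canonically-equivalent strings differ code unit by code unit, but agree after normalization. The example strings below are standard Unicode, not taken from the gist itself:

```javascript
// "é" written two ways: precomposed vs. base letter + combining accent
const precomposed = '\u00E9'; // U+00E9 LATIN SMALL LETTER E WITH ACUTE
const decomposed = 'e\u0301'; // "e" + U+0301 COMBINING ACUTE ACCENT

console.log(precomposed === decomposed); // false: raw code units differ
console.log(precomposed.normalize('NFC') === decomposed.normalize('NFC')); // true
// localeCompare also treats them as equal in ICU-backed environments:
console.log(precomposed.localeCompare(decomposed)); // 0 in Node with full ICU
```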
beaugunderson /
Last active Aug 29, 2016
cool modules from nodeconf

from streams session

  • end-of-stream - specify a callback to be called when a stream ends (which is surprisingly hard to get right)
  • duplexify - compose a Duplex stream from a Readable and a Writable stream
  • pump - pipe streams together and close all of them if one of them closes
  • pumpify - combine an array of streams into a single duplex stream using pump and duplexify
  • through2 - tools for making Transform streams
  • from2 - tools for making Readable streams

from "participatory modules" session

chrismdp /
Last active Sep 27, 2019
Uploading to S3 in 18 lines of Shell (used to upload builds for
# You don't need Fog in Ruby or some other library to upload to S3 -- shell works perfectly fine
# This is how I upload my new Sol Trader builds (
# Based on a modified script from here:
S3KEY="my aws key"
S3SECRET="my aws secret" # pass these in
function putS3
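The preview cuts off at `putS3`; the heart of such scripts is building the AWS Signature V2 string-to-sign and HMAC-ing it with `openssl`. A sketch with placeholder credentials and file names -- the exact headers in the full gist may differ:

```shell
#!/bin/sh
# Placeholder credentials and names -- not real values
S3KEY="AKIDEXAMPLE"
S3SECRET="example-secret"
bucket="my-bucket"
file="build.tar.gz"
content_type="application/x-compressed-tar"

date=$(LC_ALL=C date -u +"%a, %d %b %Y %T %z")
# Signature V2 string-to-sign for a PUT of this object
string_to_sign="PUT\n\n${content_type}\n${date}\n/${bucket}/${file}"
signature=$(printf '%b' "$string_to_sign" | openssl sha1 -hmac "$S3SECRET" -binary | base64)

echo "Authorization: AWS ${S3KEY}:${signature}"
# curl would then PUT the file with Host/Date/Content-Type/Authorization headers
```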
maxogden / index.js
Last active Aug 29, 2015
node tcp + http speed test
var net = require('net')
var through = require('through2')
var server = process.argv[2]
var path = process.argv[3]
var host = process.argv[4]
var socket = net.connect(80, server)
var req = ["GET " + path + " HTTP/1.1\r\n",
staltz /
Last active Oct 16, 2019
The introduction to Reactive Programming you've been missing
jdxcode / boot.js
Last active Oct 24, 2018
zero-downtime node.js app runner
// This script will boot app.js with the number of workers
// specified in WORKER_COUNT.
// The master will respond to SIGHUP, which will trigger
// restarting all the workers and reloading the app.
var cluster = require('cluster');
var workerCount = process.env.WORKER_COUNT || 2;
// Defines what each worker needs to run
substack / par_server.js
Created Mar 30, 2014
co + thunkify http servers
// parallel steps
var co = require('co');
var http = require('http');
var fs = require('fs');
var thunkify = require('thunkify');
var server = http.createServer(co(function *(req, res) {
var files = yield {
a: thunkify(fs.readFile)('a.txt'),
substack / browser.js
Last active Jan 3, 2016
multilevel example setup
var multilevel = require('multilevel');
var shoe = require('shoe');
var sock = shoe('/sock');
var db = multilevel.client();
window.db = db;