Synopsis

In pull-streams there are two fundamental types of streams: Sources and Sinks. There are also two composite types: Through (aka transform) and Duplex. A Through stream is a Sink that reads what comes out of a Source, and is itself a Source that can be read from. A Duplex stream is a pair of streams, {source, sink}.

Pull-Streams

Source Streams

A Source stream (aka readable stream) is an async function that may be called repeatedly until it returns a terminal state. Pull-streams have back pressure, but it is implicit rather than sent as an explicit back-pressure signal. If a source needs the sink to slow down, it may delay returning a read. If a sink needs the source to slow down, it simply waits before reading from the source again.
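To make that concrete, here is a minimal sketch of the conventional pull-stream calling convention (an illustration, assuming the usual `read(abort, cb)` source signature where the callback receives `(end, data)`, not code from the pull-stream library itself):

```javascript
// A minimal Source: each call yields the next value, then signals the
// terminal state by calling back with end === true.
function valuesSource (array) {
  var i = 0
  return function read (abort, cb) {
    if (abort) return cb(abort)            // the sink asked us to stop
    if (i >= array.length) return cb(true) // terminal state: no more data
    cb(null, array[i++])                   // end == null means "here is data"
  }
}

// A minimal Sink: pulls until it sees the terminal state. Back pressure
// is implicit, because the source is only called again once the sink is
// ready for the next value.
function collectSink (done) {
  return function (read) {
    var results = []
    read(null, function next (end, data) {
      if (end === true) return done(null, results)
      if (end) return done(end)
      results.push(data)
      read(null, next)
    })
  }
}

collectSink(function (err, results) {
  console.log(results) // prints [ 1, 2, 3 ]
})(valuesSource([1, 2, 3]))
```

A Through stream would sit between the two: it takes a `read` function and returns a new `read` function that transforms each value as it is pulled through.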

@heapwolf
heapwolf / levelup.md
Last active August 29, 2015 13:57 — forked from dshaw/levelup.md

build

Clone and build Node for analysis:

$ git clone https://github.com/joyent/node.git
$ cd node
$ export GYP_DEFINES="v8_enable_disassembler=1 v8_object_print=1"
$ export CXXFLAGS="-fno-omit-frame-pointer"
$ ./configure
@heapwolf
heapwolf / index.js
Created September 14, 2013 11:34 — forked from max-mapper/index.js
// data comes from here http://stat-computing.org/dataexpo/2009/the-data.html
// download 1994.csv.bz2 and unpack by running: cat 1994.csv.bz2 | bzip2 -d > 1994.csv
// 1994.csv should be ~5.2 million lines and 500MB
// importing all rows into leveldb took ~50 seconds on my machine
// there are two main techniques at work here:
// 1: never create JS objects, leave the data as binary the entire time (binary-split does this)
// 2: group lines into 16 MB batches, to take advantage of leveldbs batch API (byte-stream does this)
var level = require('level')
var request = require('request')
request.get('http://api.twitter.com/1/users/show.json?screen_name=mikeal', {json:true}, function (e, r, doc) {
console.log(doc)
})
request.get('https://api.twitter.com/1/statuses/user_timeline.json?include_entities=true&include_rts=true&screen_name=mikeal', {json:true}, function (e, r, doc) {
console.log(doc)
})
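The 16 MB batching idea described in the comments above can be sketched as a standalone helper (a hypothetical illustration of the technique, not the byte-stream module itself): accumulate buffers until a byte budget would be exceeded, then flush them as one group, so leveldb sees a few large db.batch() calls instead of millions of single writes.

```javascript
// Group buffers into batches capped at maxBytes. Fewer, larger
// db.batch() calls are much faster than one write per row.
function byteBatcher (maxBytes) {
  var batches = []
  var current = []
  var size = 0
  return {
    write: function (buf) {
      if (size + buf.length > maxBytes && current.length > 0) {
        batches.push(current) // flush the full batch
        current = []
        size = 0
      }
      current.push(buf)
      size += buf.length
    },
    end: function () {
      if (current.length > 0) batches.push(current) // flush the remainder
      return batches
    }
  }
}
```

Each flushed group would then be mapped to `{ type: 'put', key, value }` operations and handed to `db.batch()`.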
@heapwolf
heapwolf / install-nodejs.sh
Created August 8, 2012 18:43 — forked from TooTallNate/install-nodejs.sh
Simple Node.js installation script using the precompiled binary tarballs
#!/bin/sh
VERSION=0.8.6
PLATFORM=darwin
ARCH=x64
PREFIX="$HOME/node-v$VERSION-$PLATFORM-$ARCH"
mkdir -p "$PREFIX" && \
curl -fL http://nodejs.org/dist/v$VERSION/node-v$VERSION-$PLATFORM-$ARCH.tar.gz \
| tar xzvf - --strip-components=1 -C "$PREFIX"
@heapwolf
heapwolf / gist:3086430
Created July 10, 2012 21:47 — forked from rmurphey/gist:3086328
What's wrong with Netmag's "Optimize your JavaScript" post

I tweeted earlier that this should be retracted. Generally, these performance-related articles are essentially little more than linkbait -- there are perhaps an infinite number of things you should do to improve a page's performance before worrying about the purported perf hit of multiplication vs. division -- but this post went further than most in this genre: it offered patently inaccurate and misleading advice.

Here are a few examples, assembled by some people who actually know what they're talking about (largely Rick Waldron and Ben Alman, with some help from myself and several others from the place that shall be unnamed).

Things that are just plain wrong

  • Calling array.push() five times in a row will never be a "performance improvement." The author has clearly co
@heapwolf
heapwolf / gist:1856712
Created February 18, 2012 01:17
Plates
var data = [
  { _id: 'first',
    name: 'My first post',
    title: 'first',
    content: 'This is my first post',
    ctime: 1329501275682,
    mtime: 1329501275682,
    resource: 'Post',
    _rev: '1-c045632b8020ed83d84210f2bfe8eac5' },
  { _id: 'second',
    name: 'My second post',
var util = require('util'),
    Stream = require('stream');

var FastJSONStream = exports.FastJSONStream = function (options) {
  this.bufferSize = options.bufferSize;
  this._buffer = new Buffer(this.bufferSize);
};
util.inherits(FastJSONStream, Stream);

FastJSONStream.prototype.write = function (chunk) {
@heapwolf
heapwolf / jsonparse.js
Created February 13, 2012 23:27 — forked from creationix/jsonparse.js
Sax-only version of jsonparse
// Named constants with unique integer values
var C = {};
// Tokenizer States
var START = C.START = 0x11;
var TRUE1 = C.TRUE1 = 0x21;
var TRUE2 = C.TRUE2 = 0x22;
var TRUE3 = C.TRUE3 = 0x23;
var FALSE1 = C.FALSE1 = 0x31;
var FALSE2 = C.FALSE2 = 0x32;
var FALSE3 = C.FALSE3 = 0x33;
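To illustrate how named states like these drive a byte-at-a-time tokenizer (a hypothetical fragment, not creationix's actual parser loop), recognizing the literal true walks through TRUE1, TRUE2 and TRUE3 one byte per step:

```javascript
// Self-contained copies of the relevant states from above.
var START = 0x11;
var TRUE1 = 0x21, TRUE2 = 0x22, TRUE3 = 0x23;

// Consume one input byte and return the next tokenizer state.
// emit() is called once the full literal has been matched.
function step (state, byte, emit) {
  switch (state) {
    case START:
      if (byte === 0x74) return TRUE1; // 't'
      break;
    case TRUE1:
      if (byte === 0x72) return TRUE2; // 'r'
      break;
    case TRUE2:
      if (byte === 0x75) return TRUE3; // 'u'
      break;
    case TRUE3:
      if (byte === 0x65) { emit(true); return START; } // 'e'
      break;
  }
  throw new Error('unexpected byte 0x' + byte.toString(16));
}
```

Because every state is a distinct integer, the hot loop stays a flat switch over numbers, with no string comparisons or object allocation per byte.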