@nvasilakis
Last active August 7, 2017 12:00
PLOS submission #1 online appendix
node_modules/*

This repository accompanies PLOS2017 submission #1 and contains additional data and scripts to reproduce all the results presented in the paper.

You can clone this repository using git clone https://gist.github.com/10904b2628e87f68bdd180dcdf64c7a7.git.

Appendix

./appendix.md -- appendices covering the hardware used to run the experiments, additional application characteristics, etc.

Code

For some of these experiments you might need to increase the limits on file descriptors and processes beyond 5K; ulimit -a shows your current limits.
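For example, the soft limits can be adjusted for the current shell before launching an experiment (a sketch; the values you can set are capped by your system's hard limits, and the 5K-process runs may need larger values than shown here):

```shell
# show current limits
ulimit -a
# set the soft limit on open file descriptors
ulimit -n 4096
# set the soft limit on the number of user processes
ulimit -u 4096
```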

Repository Statistics

  • npm-stats.sh -- fetches the npm metadata database and processes it

Application Characteristics

  • apps.sh -- downloads the applications of Table 3 and places them in the directory structure expected by stats.sh
  • stats.sh -- given a group of applications, collects dependency and code characteristics (uses stats.js)
  • stats.js -- reports on dependency-tree characteristics

You can also point stats.sh and stats.js at any other npm repository of your choice to get similar results.
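As a concrete illustration of what stats.js computes, here is a toy re-implementation of its dependency-tree recursion (a sketch for illustration, not the script itself) run on a tiny hypothetical tree:

```javascript
// direct = immediate dependencies, total = all transitive dependencies,
// depth = longest dependency chain below this node
function info(obj) {
  if (!obj.dependencies) return { depth: 0, direct: 0, total: 0 };
  var deps = obj.dependencies;
  var depth = 0, total = 0, direct = Object.keys(deps).length;
  for (var d in deps) {
    var sub = info(deps[d]);
    depth = Math.max(depth, sub.depth + 1);
    total += sub.total + 1;
  }
  return { depth: depth, direct: direct, total: total };
}

// a depends on b and d; b depends on c
var tree = { dependencies: { b: { dependencies: { c: {} } }, d: {} } };
console.log(info(tree)); // { depth: 2, direct: 2, total: 3 }
```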

Compartment creation

  • ./cost.js -- creates MAX_NUM compartments (edit line 6 to change it; e.g., MAX_NUM = 5000 for 5K compartments) in several different ways and reports the results as a markdown table. You may want to comment out portions of the file, because running all variants in one go (5K processes async AND 5K processes sync AND 5K process..) generates too much load on the system. Place the report() call wherever you want the report generated, but watch out for asynchronous code that might reach the report statement before all work is done.
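The reported numbers are wall-clock times collected with process.hrtime; a minimal sketch of the measurement pattern (not cost.js itself):

```javascript
// time the creation of MAX_NUM compartments and report milliseconds
var MAX_NUM = 5; // e.g., 5000 for 5K compartments
var timer = process.hrtime();
for (var id = 0; id < MAX_NUM; id++) {
  // ... create one compartment here (require, vm context, or child process)
}
var diff = process.hrtime(timer); // [seconds, nanoseconds]
var millis = diff[0] * 1e3 + diff[1] / 1e6;
console.log(millis.toFixed(2) + ' ms for ' + MAX_NUM + ' compartments');
```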

IPC performance

These require a large input file (for copying/sending). You can run dd to create one and (ideally) place it in memory: e.g., for a 1GB file, run dd if=/dev/zero of=1GB.txt bs=10M count=100.

Function Streams:

  • ./boundaries.js -- measures function-pipeline overheads. You need to edit this file and replicate the line .pipe(new utils.Identity({once: true})) as many times as there are boundaries. For large numbers of stages you need to raise the stack limit to avoid a stack overflow; do this by passing --stack-size=X, where X is the size of the stack region v8 is allowed to use (in kBytes) (e.g., node --stack-size=100000 ./boundaries.js for 100MB).

TCP Streams:

  • ./config.js -- holds the experiment configuration. Edit this file as follows: number holds the number of compartments (e.g., number: 5000), input the location of your input file (e.g., input: '/dev/shm/1GB.txt'), and ipc the type of IPC (e.g., ipc: ipc.TCP).

  • ./cmpt.js -- spawns compartments, hooks them together, and streams data between them. Once it reports the timings of the last stage, you can kill it and process the results.

  • ./parseResults.js -- processes the results (results.txt) and generates a report.
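Putting the settings together, the edited fields of ./config.js would look roughly like this (a sketch; ipc.TCP is the enum value config.js defines, shown here as a plain string stand-in so the snippet is self-contained):

```javascript
// the three fields the TCP-stream experiment reads from ./config.js
var experiment = {
  number: 5000,               // number of compartments in the pipeline
  input: '/dev/shm/1GB.txt',  // large input file, ideally on a memory-backed fs
  ipc: 'tcp'                  // stands in for ipc.TCP from config.js
};
console.log(JSON.stringify(experiment));
```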

UDS Streams:

  • similar to TCP -- just edit the ipc parameter in ./config.js to ipc.UDS.

Appendix

Summary of Hardware Specs

  • Processor: 8x E7-8860 (10c) with Intel HT (160 virtual cores) @ 2.27GHz
  • Memory: 512GB 1067MHz DDR3
  • Disk: /dev/shm/ (memory mapped fs)
  • OS: Linux 3.16.0-4-amd64 #1 SMP Debian 3.16.39-1 (2016-12-30) x86_64 GNU/Linux

See below for more detailed hardware characteristics.

Application Characteristics

| Application | Direct | Total | Files | Depth | DLoC | MLoC | TLoC/File |
|---|---:|---:|---:|---:|---:|---:|---:|
| *commands* | | | | | | | |
| cash | 15 | 84 | 3554 | 5 | 1486 | 48540 | 13.84 |
| eslint | 34 | 135 | 4689 | 6 | 187801 | 74893 | 39.97 |
| yo | 30 | 301 | 5829 | 6 | 107713 | 106393 | 18.45 |
| *desktop* | | | | | | | |
| popcorn | 46 | 765 | 34322 | 10 | 14304 | 411706 | 12.34 |
| twitter | 10 | 120 | 4051 | 8 | 2514 | 165066 | 41.29 |
| atom | 57 | 358 | 5252 | 9 | 15939 | 548642 | 107.1 |
| *mobile* | | | | | | | |
| hackernews | 5 | 871 | 49406 | 10 | 309 | 317144 | 6.42 |
| matttermost | 17 | 521 | 13672 | 8 | 6296 | 286388 | 21.37 |
| stockmarket | 14 | 44 | 1985 | 5 | 2440 | 199119 | 101.48 |
| *server* | | | | | | | |
| express | 26 | 42 | 217 | 3 | 10159 | 2261 | 54.93 |
| ghost | 62 | 981 | 22029 | 9 | 42467 | 386676 | 19.35 |
| strider | 64 | 659 | 10357 | 8 | 21090 | 303527 | 30.41 |
| *utility* | | | | | | | |
| chalk | 3 | 4 | 9 | 2 | 217 | 10 | 18.44 |
| natural | 3 | 3 | 193 | 1 | 12483 | 4116 | 81.51 |
| winston | 6 | 6 | 83 | 1 | 4274 | 2326 | 79.52 |
| *average* | 26.13 | 326.27 | 10376.53 | 6.07 | 28632 | 190453.8 | 43.09 |

Costs

Livestar server (detailed hardware stats below)

| Compartments | InProc | FS | V8sbx | Proc |
|---:|---:|---:|---:|---:|
| 5 | 0.22 | 1.82 | 6.27 | 398.08 |
| 50 | 0.23 | 16.27 | 48.21 | 3874.94 |
| 500 | 0.41 | 155.90 | 527.25 | 39151.33 |
| 5K | 1.30 | 1217.49 | 5564.17 | 408766.03 |

Parallel 5K: 23938.59

functions

  • 5: 2.599508495
  • 50: 3.182350395
  • 500: 10.741855539
  • 5K: 84.75154958

UDS crossings: 5

  • startLatency: 17.831921 ms
  • endLatency: 73.834058 ms
  • end-to-end: 3.342693252 s
  • experiment: /dev/shm/500MB.txt:uds

UDS crossings: 50

  • startLatency: 244.587076 ms
  • endLatency: 536.685912 ms
  • end-to-end: 3.935779212 s
  • experiment: /dev/shm/500MB.txt:uds

UDS crossings: 500

  • startLatency: 3.713686854 s
  • endLatency: 11.958660013 s
  • end-to-end: 30.312946409 s
  • experiment: /dev/shm/500MB.txt:uds

TCP crossings: 5

  • startLatency: 17.771571 ms
  • endLatency: 36.666609 ms
  • end-to-end: 3.161545113 s
  • experiment: /dev/shm/500MB.txt:tcp

TCP crossings: 50

  • startLatency: 210.327483 ms
  • endLatency: 566.822015 ms
  • end-to-end: 3.731278853 s
  • experiment: /dev/shm/500MB.txt:tcp

TCP crossings: 500

  • startLatency: 6.536603478 s
  • endLatency: 15.655204252 s
  • end-to-end: 23.879532933 s
  • experiment: /dev/shm/500MB.txt:tcp

Macbook 2015

| Compartments | InProc | FS | V8sbx | Proc |
|---:|---:|---:|---:|---:|
| 5 | 0.43 | 4.35 | 12.99 | 342.58 |
| 50 | 0.40 | 30.28 | 76.62 | 3297.59 |
| 500 | 0.51 | 136.44 | 524.74 | 35213.20 |
| 5K | 1.02 | 1790.66 | 7854.45 | 362466.07 |

Parallel 500: 15403.56ms.

functions:

  • 5: 1.969190431
  • 50: 2.563155544
  • 500: 8.861439791

crossings: 5

  • startLatency: 12.978937 ms
  • endLatency: 29.250236 ms
  • end-to-end: 10.338634148 s
  • experiment: ./500MB.txt:uds

crossings: 50

  • startLatency: 224.508678 ms
  • endLatency: 451.871243 ms
  • end-to-end: 1.7833333333333334 min
  • experiment: ./500MB.txt:uds

crossings: 500

  • hung (spinning cursor) after 10 minutes of sweat and tears (only 240M copied)

System Characteristics

Output of lscpu

Architecture:          x86_64
CPU op-mode(s):        32-bit, 64-bit
Byte Order:            Little Endian
CPU(s):                160
On-line CPU(s) list:   0-159
Thread(s) per core:    2
Core(s) per socket:    10
Socket(s):             8
NUMA node(s):          8
Vendor ID:             GenuineIntel
CPU family:            6
Model:                 47
Model name:            Intel(R) Xeon(R) CPU E7- 8860  @ 2.27GHz
Stepping:              2
CPU MHz:               1064.000
CPU max MHz:           2262.0000
CPU min MHz:           1064.0000
BogoMIPS:              4533.57
Virtualization:        VT-x
L1d cache:             32K
L1i cache:             32K
L2 cache:              256K
L3 cache:              24576K
NUMA node0 CPU(s):     1-10,41-50
NUMA node1 CPU(s):     11-20,51-60
NUMA node2 CPU(s):     21-30,61-70
NUMA node3 CPU(s):     31-40,71-80
NUMA node4 CPU(s):     0,81-89,120-129
NUMA node5 CPU(s):     90-99,130-139
NUMA node6 CPU(s):     100-109,140-149
NUMA node7 CPU(s):     110-119,150-159
Flags:                 fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic popcnt aes lahf_lm ida arat epb dtherm tpr_shadow vnmi flexpriority ept vpid

Output of uname -a

Linux livestar 3.16.0-4-amd64 #1 SMP Debian 3.16.39-1 (2016-12-30) x86_64 GNU/Linux

Output of cat /proc/meminfo

MemTotal:       529202828 kB
MemFree:        258991376 kB
MemAvailable:   429298608 kB
Buffers:          239920 kB
Cached:         164687024 kB
SwapCached:         6000 kB
Active:         94882984 kB
Inactive:       166903512 kB
Active(anon):   94536368 kB
Inactive(anon):  2859420 kB
Active(file):     346616 kB
Inactive(file): 164044092 kB
Unevictable:           0 kB
Mlocked:               0 kB
SwapTotal:      1079412020 kB
SwapFree:       1079255192 kB
Dirty:                 0 kB
Writeback:             0 kB
AnonPages:      96854204 kB
Mapped:            34112 kB
Shmem:            535916 kB
Slab:            6899656 kB
SReclaimable:    6761264 kB
SUnreclaim:       138392 kB
KernelStack:       31856 kB
PageTables:       213892 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:    1344013432 kB
Committed_AS:   95840236 kB
VmallocTotal:   34359738367 kB
VmallocUsed:     1429460 kB
VmallocChunk:   33888085164 kB
HardwareCorrupted:     0 kB
AnonHugePages:         0 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
DirectMap4k:      243296 kB
DirectMap2M:    18397184 kB
DirectMap1G:    517996544 kB

Output of node -e 'console.log(process.versions)'

{ http_parser: '2.7.0',
  node: '6.9.4',
  v8: '5.1.281.89',
  uv: '1.9.1',
  zlib: '1.2.8',
  ares: '1.10.1-DEV',
  icu: '56.1',
  modules: '48',
  openssl: '1.0.2j' }
# collect dependency statistics for five classes of applications
# applications from https://github.com/sindresorhus/awesome-nodejs
# Switched visual studio to atom because visual studio requires a very
# tailored installation process. atom is missing a `libxkbfile-dev`
# dependency on debian. To pinpoint other dependencies run apt-file:
# $ sudo apt-get install apt-file
# $ sudo apt-file update
# $ apt-file search "X11/extensions/XTest.h"
# libxtst-dev: /usr/include/X11/extensions/XTest.h
mkdir apps; cd apps;
mkdir commands; cd commands;
git clone https://github.com/dthree/cash
git clone git@github.com:eslint/eslint.git
git clone https://github.com/yeoman/yo
# git clone https://github.com/ternjs/acorn
# git clone git@github.com:babel/babel.git
cd ..; mkdir desktop; cd desktop;
git clone git@github.com:popcorn-official/popcorn-desktop.git
mv popcorn-desktop popcorn
#git clone git@github.com:Microsoft/vscode.git
#mv vscode visualstudio
git clone git@github.com:k0kubun/Nocturn.git
mv Nocturn twitter
git clone git@github.com:atom/atom.git
# git clone git@github.com:TryGhost/Ghost-Desktop.git
# git clone git@github.com:zeit/hyper.git
# git clone git@github.com:mattermost/desktop.git
# git clone https://github.com/slate/slate
cd ..; mkdir mobile; cd mobile;
git clone git@github.com:mattermost/mattermost-mobile.git
mv mattermost-mobile matttermost
git clone git@github.com:iSimar/HackerNews-React-Native.git
mv HackerNews-React-Native hackernews
git clone git@github.com:ionic2blueprints/ionic2-stockmarket.git
mv ionic2-stockmarket stockmarket
# git clone git@github.com:catalinmiron/react-native-dribbble-app.git
# git clone git@github.com:aggarwalankush/ionic2-mosum.git
# git clone git@github.com:smartapant/ionic2-reddit-reader.git
# git clone git@github.com:ionic2blueprints/ionic2-wp-client.git
cd ..; mkdir server; cd server;
git clone git@github.com:expressjs/express.git
git clone git://github.com/tryghost/ghost.git
git clone git@github.com:Strider-CD/strider.git
# git clone git@github.com:uber/image-diff.git
# git clone git@github.com:keystonejs/keystone.git
# git clone https://github.com/substack/node-mkdirp
# git clone git@github.com:Unitech/pm2.git
cd ..; mkdir utility; cd utility;
git clone git@github.com:NaturalNode/natural.git
git clone https://github.com/winstonjs/winston
git clone https://github.com/chalk/chalk
# git clone https://github.com/facebook/immutable-js
# git clone https://github.com/lodash/lodash/
cd ..
#!env node
var zlib = require("zlib");
var fs = require('fs');
var utils = require('./utils.js');
var config = require('./config.js');
var options;
var toParent = null;
try {
  options = JSON.parse(process.argv[2]);
} catch (e) {
  options = {};
}
options = utils.merge(config.defaults, options);
options.output = options.output || options.input + '_OUTPUT';
// max: 650-700
// if problem with maxListeners, `stream.setMaxListeners(0)`
fs.createReadStream(options.input)
  //.pipe(zlib.createGunzip())
  .pipe(new utils.Identity({once: true}))
  .pipe(new utils.Identity({once: true}))
  .pipe(new utils.Identity({once: true}))
  .pipe(new utils.Identity({once: true}))
  .pipe(new utils.Identity({once: true}))
  .pipe(fs.createWriteStream(options.output));
//function generate(output) {
// var rs = Readable();
//
// var c = 97;
// rs._read = function () {
// rs.push('a');
// count++;
// if (count === SIZE) {
// rs.push('o');
// console.log('exiting');
// setTimeout(function () {
// process.exit(0);
// }, 1000);
// }
// };
// rs.pipe(output);
//}
//
//generate(process.stdout);
#!env node
/*
* Before running this script, generate some input files
* (for bs values linux accepts 'M', OS X 'm'):
* -- dd if=/dev/zero of=1GB.txt bs=10M count=100
* -- dd if=/dev/zero of=500MB.txt bs=10M count=50
* -- dd if=/dev/zero of=100MB.txt bs=10M count=10
* -- dd if=/dev/zero of=10MB.txt bs=1M count=10
* -- dd if=/dev/zero of=1MB.txt bs=1M count=1
*
* TODO: I'm wondering if it makes sense to read directly from the /dev/zero and keep a bytesread counter
*
* since asynchronous, pipeline is created in "reverse order"-- that is,
* from consumer to generator:
* consumer3 (parent of consumer 2) <- consumer2 (parent of consumer1) <- consumer1 (i.e., parent of generator) <- generator
*
* if no 'parent' found, it starts as standalone
*/
var childProc = require('child_process');
var utils = require('./utils.js');
var config = require('./config.js');
var net = require('net');
var dgram = require("dgram");
var udp = require('datagram-stream');
var fs = require('fs');
var log = utils.log;
var options;
// THIS IS GLOBAL
var toParent = null;
try {
  options = JSON.parse(process.argv[2]);
} catch (e) {
  options = {};
}
options = utils.merge(config.defaults, options);
options.output = options.output || options.input + '_OUTPUT';
// FIXME if as child this script is connected using pipes,
// this will end up in the parent (input) stream
console.log(options)
if (!options.parent) {
utils.logAsync('experiment: file ' + options.input + ':' + options.ipc);
}
utils.logAsync(options.number + "\n");
var stages = [];
var functionCalls = function (inputStream) {
  var chanOpts = {port: options.port, number: options.number, tag: ''};
  if (options.number <= 1) {
    chanOpts.tag += 'generator';
  } else if (!options.parent) {
    chanOpts.tag += 'consumer';
  }
  //for (var i = 0; i < 300; i++) {
  //  stages[i] = new utils.Identity(chanOpts);
  //  stages[i].setMaxListeners(0);
  //  inputStream.pipe(stages[i]);
  //};
  var outputStream = fs.createWriteStream(options.output);
  outputStream.setMaxListeners(0);
  inputStream.pipe(outputStream);
}
var getSocketName = function (port) {
  return port + '.sock';
}
var createCleanup = function () {
  // Cleanup stuff
  var cleanupStarted = false;
  var cleanup = function (num) {
    if (cleanupStarted) {
      return;
    }
    cleanupStarted = true;
    num = num || 0;
    console.log('cleaning up..');
    // remove sockets
    if (options.ipc === config.ipc.UDS) {
      console.log('= ' + getSocketName(options.port));
      fs.unlinkSync(getSocketName(options.port));
    }
  };
  // do app specific cleaning before exiting
  process.on('exit', function (e) {
    cleanup();
    process.exit(e);
  });
  process.on('uncaughtException', function (e) {
    console.log('Uncaught Exception...');
    console.log(e.stack);
    cleanup();
    process.exit();
  });
}
var generateChild = function () {
  console.log('generating child');
  // after everything is set up
  var childArgs = utils.clone(options);
  childArgs.parent = {
    pid: process.pid,
    port: options.port
  };
  childArgs.number--;
  childArgs.port++;
  //var cArgs = "'" + JSON.stringify(childArgs) + "'";
  var cArgs = JSON.stringify(childArgs);
  var args = [__filename, cArgs];
  console.log(args);
  var child = childProc.spawn('node', args);
}
var hookGeneration = function () {
  //log('\nI am process ' + process.pid + ' with socket ' + toParent + 'with "port" ' + options.port + ' and number ' + options.number + 'and parent port ' + (options.parent? options.parent.port : 'root'));
  // if last-created (i.e., generator)
  //console.log("optionsToParent: " + (toParent? "TRUE" : "FALSE") + " options.parent " + options.parent);
  var chanOpts = {port: options.port, number: options.number, tag: ''};
  if (options.number <= 1) {
    chanOpts.tag += 'generator';
  } else if (!options.parent) {
    chanOpts.tag += 'consumer';
  }
  utils.logAsync(options.port + " sending to " + (options.parent? options.parent.port : "fs"));
  if (options.number <= 1) {
    if (options.ipc === config.ipc.UDP) {
      var stream = udp({
        address: options.address,
        unicast: (options.parent? (options.parent.address || options.address) : options.address),
        port: (options.parent? options.parent.port : 1234), // port to send
        bindingPort: options.port,
        reuseAddr: true,
      });
      fs.createReadStream(options.input)
        .pipe(new utils.Identity(chanOpts))
        .pipe(options.parent? stream : fs.createWriteStream(options.output));
      return;
    }
    //log('\n hook gen')
    // simply hook generation
    fs.createReadStream(options.input)
      //.pipe(zlib.createGunzip())
      //.pipe(new T(id))
      .pipe(new utils.Identity(chanOpts))
      .pipe(options.parent? toParent :
            fs.createWriteStream(options.output));
  } else {
    if (options.ipc === config.ipc.UDP) {
      var stream = udp({
        address: options.address,
        unicast: options.parent? (options.parent.address || options.address) : options.address,
        port: options.parent? options.parent.port : (1234 + options.number), // port to send
        bindingPort: options.port,
        reuseAddr: true,
      }, generateChild);
      utils.logAsync(options.number + ":" + options.port + "\n");
      // pipe whatever is received through Identity stream (that adds stats)
      // and then either to the parent stream or to output file
      stream.pipe(new utils.Identity(chanOpts))
        .pipe(options.parent? stream : fs.createWriteStream(options.output));
      return;
    }
    // top-level: pipes to stats
    var server = net.createServer(function (socket) {
      // FIXME: this stringify is problematic!!
      //log('\n port ' + options.port + ' socket ' + utils.stringify(toParent))
      // pipe to parent
      console.log('.');
      socket.pipe(new utils.Identity(chanOpts))
        .pipe(options.parent? toParent :
              fs.createWriteStream(options.output));
    });
    server.once('listening', function () {
      generateChild();
    });
    if (options.ipc === config.ipc.UDS) {
      createCleanup();
      server.listen(getSocketName(options.port));
    } else if (options.ipc === config.ipc.TCP) {
      server.listen(options.port, options.address);
    }
  }
}
var keepEventLoopRunning = function () {
  var server = net.createServer(function (socket) {
    socket.write('Echo server\r\n');
    socket.pipe(socket);
  });
  server.listen((8080 + options.number), '0.0.0.0');
}
if (options.ipc === config.ipc.POINTER) {
  var stream = fs.createReadStream(options.input);
  stream.setMaxListeners(0);
  functionCalls(stream);
  keepEventLoopRunning();
} else {
  // if we have a parent, we first establish a connection with it,
  // then create a server that pipes there, and finally create a
  // child process; o/w we simply create a server that reports stats
  // upon completion and then a child
  if (options.parent) {
    if (options.ipc === config.ipc.UDP) {
      // no need to connect
      hookGeneration();
      keepEventLoopRunning();
      return;
    }
    toParent = new net.Socket();
    if (options.ipc === config.ipc.UDS) {
      toParent.connect(getSocketName(options.parent.port), hookGeneration);
    } else {
      toParent.connect(options.parent.port, options.parent.address || options.address, hookGeneration);
    }
  } else {
    hookGeneration();
  }
}
var utils = require('./utils.js');
var types = utils.toEnum(['process', 'sfi', 'container', 'vm']);
var ipc = utils.toEnum(['pointer', 'fifo', 'uds', 'udp', 'tcp']);
var defaults = {
  debug: 0
  , address: "0.0.0.0"
  , defaultSocket: "*.sock"
  , data: "generate"
  , port: 8080
  , type: types.PROCESS
  , ipc: ipc.UDS
  , instantiate: false
  , onFail: function (e) { console.error(e); }
  , minTime: 0
  // number of compartments
  , number: 500
  , input: '/dev/shm/500MB.txt'
  , allowed: /.*/
  , compose: false
  , replicate: false
}
module.exports = {
  types: types
  , ipc: ipc
  , defaults: defaults
}
/*
Need to add some warmup
MAKE NAMES OF EXPERIMENTS CONSISTENT WITH EACH OTHER!
*/
var MAX_NUM = 5; //Math.pow(10, 9);
var PADDING = 10;
console.log('load mods')
var t = setTimeout(function () {}, 3600 * 1000);
var fs = require('fs');
var net = require('net');
var readline = require('readline');
var path = require('path');
var childProc = require('child_process');
var utils = require('./utils.js');
var code = "var x = ";
var stats = {};
var id = 0;
// this timer value will be discarded soon
var timer = process.hrtime();
var discard = [];
var mDir="/dev/shm/ba_modules/";
console.log('reports')
var report = function () {
  console.log('Compartments ', MAX_NUM);
  var header = "| ";
  var values = "| ";
  for (var k in stats) {
    header += utils.padString(k, PADDING) + "|";
    var n = utils.toMillis(stats[k]).toFixed(2);
    values += utils.padString(n.toString(), PADDING) + "|";
    //console.log(k, stats[k], utils.hrToMicros(stats[k]));
  }
  console.log(header);
  console.log(values);
  process.exit();
}
// there are several module types:
// - m, normal modules
// - p, processes that setTimeout
// - c, continuous compartments that signal the parent via SIGUSR2
// - f, forked processes that message back over the IPC pipe
// - u, processes that listen on a UDS socket
var getName = function (id, type) {
  type = type || 'm';
  return type + '-' + id + '.js';
}
var getSocket = function (id) {
  return path.join(mDir, (id + '.sock'));
}
var getPath = function (i, o) {
  var p = path.join(mDir, getName(i, o));
  //console.log("trying to spawn ", p);
  return p;
}
var createSocket = function (id) {
  var content = readModInMod(id);
  content += "\nvar net = require('net');\n";
  content += "\nvar server = net.createServer(function(socket) {\n";
  content += "  socket.write('Echo server\\r\\n');\n";
  content += "  socket.pipe(socket);\n";
  content += "});\n";
  content += "server.listen('./" + id + ".sock" + "');\n";
  content += addTimeout();
  return content;
}
var addTimeout = function () {
  var l = (function () {}).toString();
  var t = 3600 * 1000; // 1 hour
  return "\nt = setTimeout(" + l + ", " + t + ");\n";
}
var sendMessage = function (id) {
  var content = readModInMod(id);
  content += "\nprocess.send({status: 'spawn'});";
  content += addTimeout();
  return content;
}
var keepOn = function (id) {
  var content = readModInMod(id);
  content += "\nprocess.kill(" + process.pid + ", 'SIGUSR2');";
  content += addTimeout();
  return content;
}
var readModInMod = function (id) {
  var content = "var mod = require('./" + getName(id) + "');";
  content += "\n\nmodule.exports = mod;";
  return content;
}
// Clean benchmark dir
if (!fs.existsSync(mDir)) {
  fs.mkdirSync(mDir);
} else {
  fs.readdirSync(mDir).forEach(function (file, index) {
    //console.log(path.join(mDir, file));
    fs.unlinkSync(path.join(mDir, file));
  });
}
var modules = [];
for (id = 0; id < MAX_NUM; id++) {
  if (id % 100 === 0) {
    console.log(id);
  }
  // ..modules
  fs.writeFileSync(path.join(mDir, getName(id, 'm')),
    utils.storeOn(id), 'utf8');
  // ..module compartments (simulated)
  fs.writeFileSync(path.join(mDir, getName(id, 'p')),
    readModInMod(id), 'utf8');
  // ..continuous compartments (realistic degradation)
  fs.writeFileSync(path.join(mDir, getName(id, 'c')),
    keepOn(id), 'utf8');
  // ..send message back through PIPE
  fs.writeFileSync(path.join(mDir, getName(id, 'f')),
    sendMessage(id), 'utf8');
  // ..send message back through UDS on FS
  fs.writeFileSync(path.join(mDir, getName(id, 'u')),
    createSocket(id), 'utf8');
  //modules[id] = module;
}
//console.log('eval')
//timer = process.hrtime();
//for (id = 0; id < MAX_NUM; id++) {
// discard[id] = eval(modules[id]);
//}
//stats["Eval"] = process.hrtime(timer);
//report();
//console.log('inproc')
//timer = process.hrtime();
//for (id = 0; id < MAX_NUM; id++) {
// discard[id] = utils.readFrom(modules[id]);
//}
//stats["InProc"] = process.hrtime(timer);
//
//
//
//console.log('fs')
//timer = process.hrtime();
//for (id = 0; id < MAX_NUM; id++) {
// discard[id] = require(mDir + getName(id, 'm'));
//}
//stats["FS"] = process.hrtime(timer);
//
//
//
//console.log('V8 Sandbox')
//var util = require('util');
//var vm = require('vm');
//var sandbox = {
// an: 'object',
// dummy: 'sandbox'
//};
//var code = 'var s = require(mDir + getName(id, "m"));'
//timer = process.hrtime();
//for (id = 0; id < MAX_NUM; id++) {
// // pass a new sandbox each time
// // todo can push them to an array
// vm.runInNewContext(code, {mDir: mDir, getName: getName, id: id, require: require});
// //console.log(util.inspect(sandbox));
// //discard[id] = require(mDir + getName(id, 'm'));
//}
//stats["V8sbx"] = process.hrtime(timer);
//
//
//
//console.log('proc')
//timer = process.hrtime();
//for (id = 0; id < MAX_NUM; id++) {
// if (id % 100 === 0) {
// console.log(id);
// }
// discard[id] = childProc.spawnSync('node', [getPath(id, 'p')]);
//}
//stats["Proc"] = process.hrtime(timer);
//report();
//// THIS REQUIRES ITS OWN GENERATION DUE TO KILL: kill -USR1 $PID
//console.log('procPar')
//id = 0;
//var count = 0;
//process.on('SIGUSR2', () => {
// count++;
// console.log(count)
// if (count > MAX_NUM / 10) {
// stats["proc-par"] = process.hrtime(timer);
// report();
// }
//});
//
//timer = process.hrtime();
//for (id = 0; id < MAX_NUM; id++) {
// if (id % 100 === 0) {
// console.log(id);
// }
// discard[id] = childProc.spawn('node', [getPath(id, 'c')]);
//}
//stats["proc-par"] = process.hrtime(timer);
//report();
//
// THIS REQUIRES ITS OWN GENERATION DUE TO KILL
console.log('continuous');
// kill -USR1 $PID
timer = process.hrtime();
id = 0;
process.on('SIGUSR2', () => {
  //console.log('received sigusr2. spawning new process.');
  spawn();
});
var spawn = function () {
  if (id % 100 === 0) {
    console.log(id);
  }
  if (id < MAX_NUM) {
    discard[id] = childProc.spawn('node', [getPath(id, 'c')]);
    id++;
  } else {
    stats["proc-cont"] = process.hrtime(timer);
    // better send signals
    console.log('pipe');
    //runforks();
    report();
  }
}
spawn();
//var runForks = function () {
// timer = process.hrtime();
// id = 0;
// var fork = function () {
// var n = childProc.fork(getPath(id, 'f'));
// n.on('message', function (m) {
// //console.log('PARENT got message', m, 'from ', getPath(id, 'f'));
// if (id % 100 === 0) {
// console.log(id);
// }
// id++
// if (id < MAX_NUM) {
// fork()
// } else {
// // report and go to next phase
// stats["Proc-Pipe"] = process.hrtime(timer);
// report();
// // create children array and kill them all!
// }
// });
// };
// fork();
//}
//runForks();
//var sendData = function (sid, data) {
// var client = new net.Socket();
// client.connect(sid, function () {
// // FIXME: need to return after we connect
// client.write(data);
// });
//}
//
//var runUDS = function () {
// console.log('poll-uds')
// timer = process.hrtime();
// for (id = 0; id < MAX_NUM; id++) {
// if (id % 100 === 0) {
// console.log(id);
// }
// //console.log('launching ', getPath(id, 'u'));
// childProc.spawn('node', [getPath(id, 'u')], {detached: true, stdio: 'ignore'})
// //discard[id] = childProc.execSync('./async.sh', [getPath(id, 'u')], {detached: true, stdio: 'ignore'});
// //discard[id] = childProc.execSync('./async.sh', [getPath(id, 'u')]);
// var c = 0;
// var s = (id + '.sock');
// //console.log('polling for ', s);
// while (true) {
// // can add sleep
// //if (c++ % 100000 == 0) {
// // readline.clearLine(process.stdout, -1);
// // process.stdout.write('\r..' + c.toString());
// // //console.log('\r .. ' + c);
// //}
// if (fs.existsSync(s)) {
// break;
// }
// }
// sendData(s, 'ping');
// }
// stats["poll-uds"] = process.hrtime(timer);
// report();
//}
//runUDS();
////keep server open if needed
//var server = net.createServer(function(socket) {
// socket.write('Echo server\r\n');
// socket.pipe(socket);
//});
//
//server.listen(8080, '0.0.0.0');
# This script takes quite a while to complete
# assumes debian/ubuntu linux
# installing jq requires root.
# sudo apt-get install jq
# download counts are reported at npm.js
mkdir npm-stats; cd npm-stats;
touch stats.txt
wget 'http://registry.npmjs.org/-/all' -qO all.json
cat all.json | jq . >> beautified.json
echo "package count: " >> stats.txt
grep '^ ".*":' beautified.json | wc -l >> stats.txt
mkdir all-modules
COUNTER=0; cat all.json | jq -c 'to_entries[] | {"key": .key, "value": .value}' | while read line; do
  #echo $COUNTER
  echo $line | jq . > ./all-modules/$COUNTER.json; COUNTER=$((COUNTER + 1));
done
# This assumes that all packages have authors -- a large number of them do not
echo "author count: " >> stats.txt
find ./all-modules -name '*.json' | xargs jq '.value.maintainers? , .value.author?' | grep name >> stats.txt
# Ubuntu with Universe and Multiverse enabled: 54426
echo "apt packages: " >> stats.txt
apt-cache search '' | wc -l >> stats.txt
{
  "name": "plos17",
  "version": "1.0.0",
  "description": "This repository accompanies PLOS2017 submission #1 and contains additional data and scripts to reproduce all the results presented in the paper.",
  "main": "boundaries.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "repository": {
    "type": "git",
    "url": "git+ssh://git@gist.github.com/10904b2628e87f68bdd180dcdf64c7a7.git"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "bugs": {
    "url": "https://gist.github.com/10904b2628e87f68bdd180dcdf64c7a7"
  },
  "homepage": "https://gist.github.com/10904b2628e87f68bdd180dcdf64c7a7",
  "dependencies": {
    "datagram-stream": "^1.1.1"
  }
}
#!env node
var fs = require('fs');
var utils = require('./utils.js');
var ph = require('pretty-hrtime');
var contents = fs.readFileSync('results.txt', 'utf-8');
var startGenerator = contents.match(/start:generator (\[.*\])/g);
var endGenerator = contents.match(/end:generator (\[.*\])/g);
var startConsumer = contents.match(/start:consumer (\[.*\])/g);
var endConsumer = contents.match(/end:consumer (\[.*\])/g);
var experiment = contents.match(/experiment.*file.*/g);
//TODO: we should be able to capture the whole pipeline and
//create nice waterfalls diagrams:)
if (startGenerator.length < 1 || endGenerator.length < 1 ||
    startConsumer.length < 1 || endConsumer.length < 1) {
  console.log('There is either something missing or more than one');
  process.exit();
}
// there might be more than one experiment -- take the latest
startGenerator = startGenerator[startGenerator.length - 1].match(/start:generator (\[.*\])/)[1];
endGenerator = endGenerator[endGenerator.length - 1].match(/end:generator (\[.*\])/)[1];
startConsumer = startConsumer[startConsumer.length - 1].match(/start:consumer (\[.*\])/)[1];
endConsumer = endConsumer[endConsumer.length - 1].match(/end:consumer (\[.*\])/)[1];
experiment = experiment[experiment.length - 1].replace(/experiment: file/,'').replace(/\"/g,'') ;
var startLatency = utils.diffHrtime(JSON.parse(startGenerator), JSON.parse(startConsumer));
var endLatency = utils.diffHrtime(JSON.parse(endGenerator), JSON.parse(endConsumer));
var e2e = utils.diffHrtime(JSON.parse(startGenerator), JSON.parse(endConsumer));
//var diff2 = utils.diffHrtime(JSON.parse(end[1]), JSON.parse(start[1]));
//console.log('diff', diff1, diff2);
console.log('startLatency: ', ph(startLatency, {precise:true}));
console.log('endLatency: ', ph(endLatency, {precise:true}));
console.log('end-to-end: ', ph(e2e, {precise:true}));
console.log('experiment: ', experiment.trim());
#!env node
/**
* requires a sth.json as an input
*/
var fs = require('fs');
// returns direct and total dependency counts, as well as the depth
// of the dependency tree
var info = function (obj) {
  if (!obj.dependencies) {
    return {depth: 0, direct: 0, total: 0};
  }
  var deps = obj.dependencies;
  //var keys = Object.keys(obj);
  var depth = 0;
  var total = 0;
  var direct = Object.keys(deps).length;
  for (var d in deps) {
    //console.log(d, deps[d])
    if (!deps.hasOwnProperty(d)) continue;
    var subtree = info(deps[d]);
    depth = Math.max(depth, (subtree.depth + 1));
    total += (subtree.total + 1);
  }
  // winston 3, 10, 18
  return {depth: depth, direct: direct, total: total};
}
var parse = function (source) {
  try {
    source = JSON.parse(source);
  } catch (e) {
    console.error("File does not look like valid JSON", e);
    process.exit(1);
  }
  return source;
}
var print = function (obj) {
  console.log(JSON.stringify(obj));
  for (var k in obj) {
    console.log(k + ':', obj[k]);
  }
}
if (require.main === module) {
// parse arguments
if (process.argv.length > 2) {
var fname = process.argv[2];
if (!fs.existsSync(fname)) {
console.error("File does not exist", fname);
process.exit(1);
}
// it will choke if too large!
var source = fs.readFileSync(fname);
var jsrc = parse(source);
print(info(jsrc));
} else {
var readline = require('readline');
var source = "";
var rl = readline.createInterface({
input: process.stdin,
output: process.stdout,
terminal: false
});
rl.on('line', function (data) {
source += data + "\n";
});
rl.on('close', function () {
var jsrc = parse(source);
print(info(jsrc));
});
}
} else {
module.exports = info
}
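As a quick sanity check of the traversal above, here is a sketch applying the `info` function (reproduced from stats.js) to a small hand-written tree in the shape `npm ls --json` produces; the package names are hypothetical:

```javascript
// Sketch: the info() traversal from stats.js, applied to a tiny
// hand-written tree shaped like `npm ls --json` output.
var info = function (obj) {
  if (!obj.dependencies) {
    return {depth: 0, direct: 0, total: 0};
  }
  var deps = obj.dependencies;
  var depth = 0;
  var total = 0;
  var direct = Object.keys(deps).length;
  for (var d in deps) {
    if (!deps.hasOwnProperty(d)) continue;
    var subtree = info(deps[d]);
    depth = Math.max(depth, subtree.depth + 1);
    total += subtree.total + 1;
  }
  return {depth: depth, direct: direct, total: total};
};

// a -> b is a transitive dependency; c is a second direct one
var tree = {dependencies: {a: {dependencies: {b: {}}}, c: {}}};
var stats = info(tree);
console.log(stats); // {depth: 2, direct: 2, total: 3}
```

Note that `total` counts every node in the tree, so duplicated (non-deduplicated) packages are counted once per occurrence.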
#!/bin/bash
# collect dependency statistics for various types of applications
# applications from https://github.com/sindresorhus/awesome-nodejs
# REQUIRES some kind of `sloc` command
# e.g., sudo npm install -g sloc
command -v sloc >/dev/null 2>&1 ||
{ echo >&2 "sloc required but not installed. Aborting."; exit 1; }
# requires node, npm, sloc
# can change environmental variable
HERE=$(pwd)
REPORT=${REPORT:-$HERE/report.txt}
APPS=${APPS:-$HERE/apps}
JSFS=/tmp/JSFS
touch $REPORT
#mkdir $(pwd)/reports
TEMPLATE='| %15s | %8s | %5s | %5s | %5s | %5s | %5s | %5s | %10s |\n'
HEADER="$(printf "${TEMPLATE}" Application Direct Total Files Depth DLoC MLOC TMLOC LoC/File)"
echo "$HEADER" >> $REPORT
echo "$HEADER" | tr ' /' '-' | tr a-zA-Z '-' >> $REPORT
function shrinkage {
npm ls --silent --only=production --parseable > PROD.txt
echo "total: $(cat PROD.txt | wc -l)"
echo "dedup: $(sed 's;/.*/;;' PROD.txt | sort | uniq -c | wc -l)"
}
function direct_slocount {
rm -rf $JSFS
mkdir $JSFS
find . -type f -name '*.js' -not -path './node_modules/*' -exec cp '{}' $JSFS/ \;
sloc $JSFS | grep Source | grep -o ' [0-9]*$' | tr -d '[:space:]'
}
function module_slocount {
rm -rf $JSFS
mkdir $JSFS
find ./node_modules -type f -name '*.js' -exec cp '{}' $JSFS/ \;
sloc $JSFS | grep Source | grep -o ' [0-9]*$' | tr -d '[:space:]'
}
function total_slocount {
rm -rf $JSFS
mkdir $JSFS
find . -type f -name '*.js' -exec cp '{}' $JSFS/ \;
sloc $JSFS | grep Source | grep -o ' [0-9]*$' | tr -d '[:space:]'
}
function remove_modules {
rm -rf node_modules
}
function generateDeps {
npm ls --silent --only=production --json > PROD.json
npm ls --silent --only=production --parseable > PROD.txt
}
function report {
pkg=$(basename $1 | sed 's;/;;')
echo $pkg
# fixme run a slocount first
if [[ $pkg == 'vscode' ]]; then
./scripts/npm.sh install
else
echo 'installing production modules'
npm install --only=production > dep_tree
fi
# remove_modules
npm install --only=production > dep_tree
STATS=$(npm ls --silent --only=production --json | node $HERE/stats.js)
# using two top to verify results
# DIRECT1=$(npm ls --depth=0 | grep '─' | wc -l | tr -d '[:space:]')
# DIRECT2=$(jq '.dependencies | length' package.json | tr -d '[:space:]')
echo "$pkg" "$STATS"
DIRECT=$(echo "$STATS" | grep '^direct' | sed 's/^direct: //')
TOTAL=$(echo "$STATS" | grep '^total' | sed 's/^total: //')
DEPTH=$(echo "$STATS" | grep '^depth' | sed 's/^depth: //')
# TOTAL1=$(grep '─' dep_tree | wc -l | tr -d '[:space:]')
# TOTAL2=$(npm ls | grep '─' | wc -l | tr -d '[:space:]')
FILES=$(find . -type f -name '*.js' | wc -l | tr -d '[:space:]')
MSLOC=$(module_slocount | tr -d '[:space:]')
DSLOC=$(direct_slocount | tr -d '[:space:]')
TSLOC=$(total_slocount | tr -d '[:space:]')
RATIO=$(echo "scale=1; $TSLOC/$FILES" | bc -l)
# rm -rf node_modules
# npm install > dep_tree
# DEV_TOP1=$(npm ls --depth=0 | grep '─' | wc -l | tr -d '[:space:]')
# #DEV_TOP2=$(jq '.dependencies | length' package.json | tr -d '[:space:]')
# DEV_DEEP1=$(grep '─' dep_tree | wc -l | tr -d '[:space:]')
# # DEV_DEEP2=$(npm ls | grep '─' | wc -l | tr -d '[:space:]')
# DEV_TOTAL=$(find . -type f -name '*.js' | wc -l | tr -d '[:space:]')
echo
printf "$TEMPLATE" $pkg $DIRECT $TOTAL $FILES $DEPTH $DSLOC $MSLOC $TSLOC $RATIO
echo
sleep 2
printf "$TEMPLATE" $pkg $DIRECT $TOTAL $FILES $DEPTH $DSLOC $MSLOC $TSLOC $RATIO >> $REPORT
}
function clone {
cat .git/config | grep url | sed 's/^.*= /git clone /'
}
echo "APPS directory " $APPS
cd $APPS
for type in */; do
t=$(basename $type)
echo "Group: " $t
cd $t
printf "$TEMPLATE" $t '' '' '' '' '' '' '' '' >> $REPORT
for pkg in */; do
echo $pkg
cd $pkg
#generateDeps
report $pkg
cd ..
done
cd ..
done
/*
This file is part of the Andromeda codebase. Licensed under GPL v.2.
https://github.com/andromeda/andromeda/blob/master/src/utils/utils.js
https://github.com/andromeda/andromeda/blob/master/src/utils/serialization.js
*/
"use strict";
var url = require('url');
var crypto = require('crypto');
var regex = {
IPv4: /^(?!.*\.$)((1?\d?\d|25[0-5]|2[0-4]\d)(\.|$)){4}$/
};
var Color = {
BOLD: [1, 22]
, ITALIC: [3, 23]
, UNDERLINE: [4, 24]
, INVERSE: [7, 27]
, WHITE: [37, 39]
, GREY: [90, 39]
, BLACK: [30, 39]
, BLUE: [34, 39]
, CYAN: [36, 39]
, GREEN: [32, 39]
, MAGENTA: [35, 39]
, RED: [31, 39]
, YELLOW: [33, 39]
};
Color.ize = function (color, s) {
return "\u001b[" + color[0] + "m" + s + "\u001b[" + color[1] + "m";
};
var C256 = {
SGREY: [60, 39] // #708090, slategray
, BEIGE: [231, 39] // #f8f8f2, beige
, PINK: [204, 39] // #f92672, pink
, MOEUVE: [147, 39] // #ae81ff, mauve
, LIMGRN: [149, 39] // #a6e22e, limegreen
, MSTRD: [222, 39] // #e6db74, mustard
, CYAN: [117, 39] // #66d9ef, cyan
, ORANGE: [215, 39] // #fd971f, orange
, BOLD: [1, 22]
, UNDERLINE: [4, 24]
};
C256.ize = function (color, s) {
if (color[0] < 10) {
return "\u001b[" + color[0] + "m" + s + "\u001b[" + color[1] + "m";
}
return '\u001b[38;5;' + color[0] + 'm' + s + '\u001b['+ color[1] + 'm';
};
C256.l8r = function (color) {
return function (s) {
if (color[0] < 10) {
return "\u001b[" + color[0] + "m" + s + "\u001b[" + color[1] + "m";
}
return '\u001b[38;5;' + color[0] + 'm' + s + '\u001b['+ color[1] + 'm';
};
};
Color.printSpectrum = function () {
for (var c in Color) {
if (require('util').isArray(Color[c])) {
console.log(Color.ize(Color[c], ('this is ' + c)));
}
}
};
function coerce(val) {
if (val instanceof Error) return val.stack || val.message;
return val;
}
// taken from lodash
function isNative(value) {
// Used to resolve the internal `[[Class]]` of values
var toString = Object.prototype.toString;
// Used to resolve the decompiled source of functions
var fnToString = Function.prototype.toString;
// Used to detect host constructors (Safari > 4; really typed array specific)
var reHostCtor = /^\[object .+?Constructor\]$/;
// Compile a regexp using a common native method as a template.
// We chose `Object#toString` because there's a good chance it is not being mucked with.
var reNative = RegExp('^' +
// Coerce `Object#toString` to a string
String(toString)
// Escape any special regexp characters
.replace(/[.*+?^${}()|[\]\/\\]/g, '\\$&')
// Replace mentions of `toString` with `.*?` to keep the template generic.
// Replace thing like `for ...` to support environments like Rhino which add extra info
// such as method arity.
.replace(/toString|(function).*?(?=\\\()| for .+?(?=\\\])/g, '$1.*?') + '$'
);
var type = typeof value;
if (type === 'function') {
// Use `Function#toString` to bypass the value's own `toString` method
// and avoid being faked out.
return reNative.test(fnToString.call(value));
} else {
// Fallback to a host object check because some environments will represent
// things like typed arrays as DOM methods which may not conform to the
// normal native pattern.
return (value && type === 'object' && reHostCtor.test(toString.call(value)));
}
}
function addForward(uri1, uri2) {
var destructured = url.parse(uri1, true);
destructured.query.forward = uri2;
delete destructured.search;
return url.format(destructured);
}
function splitIntoForward(uri) {
var u = url.parse(uri, true);
if (u.query && u.query.forward) {
var s = u.query.forward;
delete u.query.forward;
delete u.search;
return {uri: url.format(u), forward: s};
}
return uri;
}
function multiForward(listUris) {
if (!(listUris instanceof Array) || (listUris.length < 1)) {
throw new Error('multiForward argument is not an array (or is empty)');
}
var slice = listUris.slice(1);
var suffix = (slice.length === 1)? slice[0] : multiForward(slice);
return addForward(listUris[0], suffix);
}
var timestamp = function timestamp() {
var d = new Date();
var time = [d.getHours(), d.getMinutes(), d.getSeconds(), ("000" + d.getMilliseconds()).slice(-3)].join(':');
return '[' + time + ']';
};
var generateError = function (where) {
var backtrace = new Error();
return function(err) {
if (err) {
backtrace.stack = err.name + ': ' + err.message +
backtrace.stack.substr(backtrace.name.length);
throw backtrace;
}
};
};
function ensureCallback(cb) {
return typeof cb === 'function' ? cb : function () {};
}
var typeOf = function(v) {
return ({}).toString.call(v).match(/\s([a-zA-Z]+)/)[1].toLowerCase();
};
var toHumanReadable = function(bytes) {
if (bytes < 1024) {
return bytes + " B";
}
var exp = Math.floor(Math.log(bytes) / Math.log(1024));
var pre = "KMGTPE".charAt(exp-1);
return require('util').format("%d%sB", (bytes / Math.pow(1024, exp)).toFixed(1), pre);
};
// (placeholder) -- modified from https://gist.github.com/jed/982883
var uuid4 = function (a) {
return a? (a^crypto.randomBytes(1)[0] % 16 >> a/4).toString(16) :
([1e7] + -1e3 + -4e3 + -8e3 + -1e11).replace(/[018]/g, uuid4);
};
var uuid5 = function (str) {
var digest = crypto.createHash('sha256').update(str).digest("hex");
digest = digest.slice(0, 8) + '-' + digest.slice(9, 13) + '-5' +digest.slice(14, 17) + '-' + digest.slice(18, 22) + '-' + digest.slice(22, 34);
return digest;
};
var isNode = function (obj) {
return (obj && obj.host && obj.port && typeof obj.host === 'string' && typeof obj.port === 'number');
};
var toNodeId = function (obj) {
if (!isNode(obj)) {
return null;
}
return uuid5(obj.host + ':' + obj.port);
};
var toNode = function (obj) {
if (!isNode(obj)) {
return null;
}
return {host: obj.host, port: obj.port};
};
var toArgs = function (varargs) {
return arguments;
};
var toCall = function (obj) {
if (typeof obj !== 'object') {
return false;
}
var result = [];
for (var k in obj) {
var num = Number.parseInt(k, 10);
if (num.toString() === k) {
result[num] = obj[k];
} else {
return false;
}
}
if (result.length === Object.keys(obj).length) {
return result;
} else {
return false;
}
};
var getIP4 = function () {
//based on http://stackoverflow.com/a/8440736
var ifaces = require('os').networkInterfaces();
var res = [];
Object.keys(ifaces).forEach(function (ifname) {
var alias = 0;
ifaces[ifname].forEach(function (iface) {
if ('IPv4' !== iface.family || iface.internal !== false) {
// skip over internal (i.e. 127.0.0.1) and non-ipv4 addresses
return;
}
if (alias >= 1) {
// this single interface has multiple ipv4 addresses
res.push({alias: alias, address: iface.address});
} else {
// this interface has only one ipv4 adress
res.push({address: iface.address});
}
++alias;
});
});
return res;
};
var defaultOpts = function (obj, prop, df) {
// if obj is not an object (e.g., a string, regex, etc.), use it as the value;
// if it is null/undefined and a default value is given, fall back to the default
var p = (typeOf(obj) !== 'object' && typeOf(obj) !== 'undefined' &&
typeOf(obj) !== 'null') ? obj : ((obj && obj[prop]) ? obj[prop] : df);
obj = (typeOf(obj) === 'object')? obj : {};
obj[prop] = p;
return obj;
};
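`defaultOpts` normalizes "either a bare value or an options object" arguments into an options object. A sketch (helper and function reproduced from utils.js; the argument values are made up):

```javascript
// Sketch: typeOf and defaultOpts reproduced from utils.js.
var typeOf = function (v) {
  return ({}).toString.call(v).match(/\s([a-zA-Z]+)/)[1].toLowerCase();
};

var defaultOpts = function (obj, prop, df) {
  // bare value -> use it; object -> use obj[prop]; otherwise the default
  var p = (typeOf(obj) !== 'object' && typeOf(obj) !== 'undefined' &&
    typeOf(obj) !== 'null') ? obj : ((obj && obj[prop]) ? obj[prop] : df);
  obj = (typeOf(obj) === 'object') ? obj : {};
  obj[prop] = p;
  return obj;
};

console.log(defaultOpts('eth0', 'name', 'lo'));         // {name: 'eth0'}  bare value wins
console.log(defaultOpts({name: 'eth1'}, 'name', 'lo')); // {name: 'eth1'}  object property wins
console.log(defaultOpts(undefined, 'name', 'lo'));      // {name: 'lo'}    default kicks in
```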
var defaultCallback = function (cb, df) {
return cb || df || function () {};
};
// colors are pluggable
// fixme -- simply add a beautify options that expands newline
var stringify = function(code, options) {
var defaultOptions = {
name: '',
depth: 0,
expandBoxedPrimitives: false,
traverseHierarchy: false,
overwritePrompt: false,
colors: false
};
defaultOptions.palette = [
'str', 'symbol', 'fun', 'num', 'null', 'undefined', 'boxed', 'regexp', 'key'
].reduce(function (o, i) {
// each type gets a function that simply returns the contents
o[i] = function (s) {return s;};
return o;
}, {});
options = merge(defaultOptions, options);
var overwrite = options.overwritePrompt;
options.overwritePrompt = false;
var s;
if (typeof code === 'function') {
// fixme for built-ins, we can't get a name; could have a lookup map
s = code.toString();
if (options.colors) {
s = options.palette.fun(s);
}
} else if (typeof code === 'string') {
s = '"' + code.toString() + '"';
if (options.colors) {
s = options.palette.str(s);
}
} else if (typeof code === 'number') {
s = code.toString();
if (options.colors) {
s = options.palette.num(s);
}
} else if (typeof code === 'object') {
var sty = typeOf(code);
if (sty === 'array') {
s = '[';
var cm = (code.length > 3)? ',\n' : ', ';
for (var i = 0; i < code.length; i++) {
s += stringify(code[i], options) + cm;
}
s = (code.length > 0)? s.slice(0, s.length - 2) + ']' : '[]';
} else if (sty === 'object' || sty === 'arguments') {
s = '{';
var num = 0;
var v;
// checking if there is budget for a single line using a heuristic:
// lineSize - sizeOfKeys - (sizeOfFirstValue * numberOfValues)
var budget = 100 - Object.keys(code).join(' ').length -
(stringify(code[Object.keys(code)[0]], options).length *
Object.keys(code).length);
for (var key in code) {
if (!options.traverseHierarchy && !code.hasOwnProperty(key)) {
continue;
}
v = stringify(code[key], options);
v = (v.split('\n').length > 1)? ('\n' + v.replace(/^/gm,' ')) : v;
var sk = options.colors? options.palette.key('"'+key+'"') : key;
var comma = (budget < 0 || Object.keys(code).length > 2)? ',\n' : ', ';
s += (num === 0? '' : ' ') + sk + ': ' + v + comma;
num++;
}
var l = (s.length - 2 <= 2)? 1 : s.length - 2;
s = s.slice(0, l) + '}';
} else if (typeOf(code) === 'null') {
s = 'null';
if (options.colors) {
s = options.palette.null(s);
}
} else if (typeOf(code) === 'regexp') {
s = code.toString();
if (options.colors) {
s = options.palette.regexp(s);
}
// arguments object, date, other built-in objects
} else {
if (options.expandBoxedPrimitives) {
s = 'new ' + typeOf(code).charAt(0).toUpperCase() +
typeOf(code).slice(1) + '("' + code.toString() + '");';
if (options.colors) {
s = options.palette.boxed(s);
}
} else {
s = '"' + code.toString() + '"';
}
}
} else if (typeof code === 'symbol') {
s = code.toString();
if (options.colors) {
s = options.palette.symbol(s);
}
} else if (typeof code === 'boolean') {
s = code.toString();
if (options.colors) {
s = options.palette.boolean(s);
}
} else {
if (code === undefined) {
s = "undefined";
if (options.colors) {
s = options.palette.undefined(s);
}
} else if (code === null) {
s = "null";
if (options.colors) {
s = options.palette.null(s);
}
} else {
if (code.toString) {
s = code.toString();
} else {
s = 'EXCEPTION';
// what to do if it does not have toString?
}
}
}
return padString(s, overwrite);
};
var typeAsString = function (o, x) {
if (o === undefined) {
return '';
}
var MAX_LINES = 1;
var MAX_LENGTH = 72;
var ty = typeOf(o);
x = x || ' :: ';
o = stringify(o);
if (o.length > MAX_LENGTH) {
o = o.slice(0, 20) + '[..truncated..]' + o.slice(-20);
o = o.replace(/\s/g, ' ');
}
var lines = o.split('\n');
// elide the middle if longer than MAX_LINES
var res = [];
if (lines.length > MAX_LINES) {
for (var i = 0; i < Math.floor(MAX_LINES / 2); i++) {
res.push(lines[i]);
}
res.push('[..multiple lines..]');
for (i = lines.length - Math.floor(MAX_LINES / 2); i < lines.length; i++) {
res.push(lines[i]);
}
} else {
res = lines;
}
o = '@base ' + res.join('\n') + x + ty;
return Color.ize(Color.GREY, o);
};
var isCyclic = function (obj) {
try {
JSON.stringify(obj);
return false;
} catch (e) {
return true;
}
};
var detectCycle = function (obj) {
var seenObjects = [];
function detect (obj) {
if (!isCyclic(obj)) {
return {isCyclic: false};
}
if (seenObjects.indexOf(obj) !== -1) {
return true;
}
seenObjects.push(obj);
for (var key in obj) {
if (!obj.hasOwnProperty(key)) continue;
var r = detect(obj[key]);
// detect returns a (truthy) object even for acyclic values, so
// check explicitly for a revisit (true) or a found cycle
if (r === true || (r && r.isCyclic)) {
return {isCyclic: true, obj: obj, key: key};
}
}
}
return detect(obj);
};
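The cycle detection above piggybacks on `JSON.stringify`, which throws on circular structures; `detectCycle` then walks the object to name the key where the cycle enters. A sketch (reproduced from utils.js, with the recursion's return value checked explicitly; the object shape is hypothetical):

```javascript
// Sketch: isCyclic/detectCycle from utils.js. JSON.stringify throws
// on circular structures, which isCyclic turns into a boolean.
var isCyclic = function (obj) {
  try {
    JSON.stringify(obj);
    return false;
  } catch (e) {
    return true;
  }
};

var detectCycle = function (obj) {
  var seenObjects = [];
  function detect(obj) {
    if (!isCyclic(obj)) {
      return {isCyclic: false};
    }
    if (seenObjects.indexOf(obj) !== -1) {
      return true; // revisited: this is the back-edge
    }
    seenObjects.push(obj);
    for (var key in obj) {
      if (!obj.hasOwnProperty(key)) continue;
      var r = detect(obj[key]);
      // only a revisit (true) or a found cycle counts as a hit
      if (r === true || (r && r.isCyclic)) {
        return {isCyclic: true, obj: obj, key: key};
      }
    }
  }
  return detect(obj);
};

var cyclic = {name: 'node-a', peer: {}};
cyclic.peer.back = cyclic; // introduce a cycle
console.log(detectCycle({name: 'node-a'}).isCyclic); // false
console.log(detectCycle(cyclic).isCyclic);           // true
console.log(detectCycle(cyclic).key);                // 'peer' -- the edge into the cycle
```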
var clone = function(src) {
return JSON.parse(JSON.stringify(src));
};
var _checkCircular = function(e, o1, o2) {
if (e.name === 'RangeError') {
var inspect = require('util').inspect;
return (inspect(o1) === inspect(o2));
}
return false;
};
var equal = function (o1, o2) {
try {
require('assert').deepEqual(o1, o2);
return true;
} catch (e) {
return _checkCircular(e, o1, o2);
}
};
var strictEqual = function (o1, o2) {
try {
require('assert').deepStrictEqual(o1, o2);
return true;
} catch (e) {
return _checkCircular(e, o1, o2);
}
};
// on type mismatch, use approriate resolution or, by default, return o1 -- so
// that defaults are never overwritten!
// but if type is the same (and not object -- i.e., no recursion) pick o2!
// options:
// * onTypeMismatch: what to do if the two objects are not of the same type
// * onType[x] = function (o1, o2): what to do on type mismatches
// * onCycle: what to do if any of the object has a cycle
// * clone: [false] clone object before anything
var merge = function (o1, o2, options) {
options = options || {};
// merge(1, [1,2,3]) -> [1,2,3]
if (typeOf(o1) !== typeOf(o2)) {
if (options.onTypeMismatch) {
return options.onTypeMismatch(o1, o2);
}
return o1;
}
if (isCyclic(o1) || isCyclic(o2)) {
if (options.onCycle) {
return options.onCycle(o1, o2);
}
return o1;
}
if (options.clone) {
o1 = clone(o1);
o2 = clone(o2);
}
// merge(1, 2) -> 2
if (typeOf(o1) !== 'object') {
// at this point o1 and o2 have the same type
if (options.onType && options.onType[typeOf(o1)]) {
return options.onType[typeOf(o1)](o1, o2);
}
return o2;
}
var result = options.clone? clone(o1) : o1;
if (options.onType && options.onType['object']) {
// user can decide whether to use pre-populated 'result'
return options.onType['object'](o1, o2, options, result);
}
options.clone = false;
for (var k in o2) {
// we won't keep cloning recursively!
result[k] = result[k]? merge(result[k], o2[k], options) : o2[k];
}
return result;
};
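The default merge policy, then: on a type mismatch `o1` (the defaults) wins, so defaults are never clobbered by a wrongly-typed override; on matching scalar types `o2` (the override) wins; objects merge recursively. A condensed sketch below reproduces that default path from utils.js with the option hooks (`onTypeMismatch`, `onType`, `onCycle`, `clone`) omitted; the config values are made up:

```javascript
// Sketch: merge's default policy, condensed from utils.js
// (option hooks omitted for brevity).
var typeOf = function (v) {
  return ({}).toString.call(v).match(/\s([a-zA-Z]+)/)[1].toLowerCase();
};
var isCyclic = function (obj) {
  try { JSON.stringify(obj); return false; } catch (e) { return true; }
};

var merge = function (o1, o2) {
  if (typeOf(o1) !== typeOf(o2)) {
    return o1; // type mismatch: keep the default
  }
  if (isCyclic(o1) || isCyclic(o2)) {
    return o1; // refuse to walk circular structures
  }
  if (typeOf(o1) !== 'object') {
    return o2; // same scalar type: the override wins
  }
  var result = o1;
  for (var k in o2) {
    result[k] = result[k] ? merge(result[k], o2[k]) : o2[k];
  }
  return result;
};

var defaults = {retries: 1, depth: 5, palette: {str: 'white'}};
var merged = merge(defaults, {retries: 3, depth: 'deep', palette: {str: 'red'}});
console.log(merged);
// {retries: 3, depth: 5, palette: {str: 'red'}}
// depth stays 5: 'deep' is a string, not a number -- type mismatch
```

Like the original's default path, this mutates `o1` in place (the full version offers a `clone` option to avoid that), and falsy existing values are replaced directly rather than merged.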
// Accept a string of the forms:
// * '[.]one.two.three[.]'
// * '[/]one/two/three[/]'
// ..and creates a similar property on the value
var createProperty = function (obj, propertyString, value) {
var props;
if (propertyString.indexOf('.') >= 0) {
props = propertyString.replace(/^\.|\.$/g, '').split('.');
} else if (propertyString.indexOf('/') >= 0) {
props = propertyString.replace(/^\/|\/$/g, '').split('/');
} else {
props = [propertyString];
}
var tmp = obj;
for (var i =0; i < props.length; i++) {
if (!tmp.hasOwnProperty(props[i]) || (typeof tmp[props[i]] !== 'object')) {
tmp[props[i]] = {};
if ((i === props.length - 1) && (arguments.length === 3)) {
tmp[props[i]] = value;
}
}
tmp = tmp[props[i]];
}
return obj;
};
var unix = {
// fixme: need to add asynchronous
exec: function (cmd) {
if (/^win/.test(require('os').platform())) {
return new Error('Not on Unix');
}
var exec = require('child_process').execSync;
var s;
try {
s = exec(cmd, {stdio: ['pipe', 'pipe', 'pipe']}).toString();
return s;
} catch (e) {
e.stdio = s;
return e;
}
}
};
var percentOf = function (part, total) {
return ((part / total) * 100).toFixed(2) + "%"
};
var padString = function (str, len, c) {
c = c || ' ';
return len? ((str.length < len) ? padString(str + c, len, c) : str) : str;
};
var escape = function (key) {
// we are encoding spaces, slashes etc.
return encodeURIComponent(key);
};
var unescape = function (key) {
// decode what escape() encoded (spaces, slashes, etc.)
return decodeURIComponent(key);
};
var getStats = function (arr) {
// trivial stats -- to be extended soon
var sum = 0;
for (var i = 0; i < arr.length; i++) {
sum += arr[i];
}
return {total: arr.length, avg: sum / arr.length}
};
var toMicros = function (t) {
return (t[0] * 1000000 + t[1]/1000);
}
var toMillis = function (t) {
return (t[0] * 1000 + t[1]/1000000);
}
var defOpts = {
name: '',
expandBoxedPrimitives: false,
colors: false
};
var readFrom = function readFrom(code, options) {
// in any other case return the value
if (typeof code !== 'string') {
return code;
}
options = merge(defOpts, options);
var Module = module.constructor;
var path = require('path');
var paths = Module._nodeModulePaths(path.dirname(options.name));
var m = new Module(options.name, module.parent);
//TODO, this might actually require a resolution protocol
m.filename = options.name;
m.paths = [].concat(paths);
m._compile(code, options.name);
return m.exports;
};
// offer a minification option
var storeOn = function storeOn (code, options) {
options = merge(defOpts, options);
var s = stringify(code, options);
if (options.wrap) {
s = 'function (andromeda) {\n return ' + s + '}';
}
if (options.context && typeof options.context === 'object') {
var ctx = 'function (andromeda) {\n var context = ' +
stringify(options.context).replace(/\n/g, '\n ') + ';';
s = s.replace(/^.*/, ctx);
}
if (options.name && typeof options.name === 'string' && options.name !== '') {
return 'module.exports.' + options.name + ' = ' + s + ';';
} else {
return 'module.exports = ' + s + ';';
}
};
// Impl. 1
var Transform = require('stream').Transform;
require('util').inherits(Identity, Transform);
var instances = 0;
var ids = [];
function Identity(options, tOptions) {
if (!(this instanceof Identity))
return new Identity(tOptions);
Transform.call(this, tOptions);
// stream statistics
// pattern match stream for end to output stats to stderr
this._options = options;
this._counter = 0;
this._extras = '';
this._reported = false;
this.totalLatency = -99;
this.id = 0;
};
Identity.prototype._transform = function(chunk, encoding, done) {
//console.error(chunk.toString().length)
// pattern match stream for end to output stats to stderr
//log('\n pushing from options.port' + this._options.port);
if (this._options.once) {
if (this.totalLatency === -99) {
console.log(this._options.once)
this.totalLatency = process.hrtime();
this.id = instances;
ids[instances] = this.totalLatency;
console.log('=', ids);
console.log('start time', this.totalLatency);
instances++;
}
} else {
if (!this._reported) {
this._reported = true;
var t = process.hrtime();
var s = "\n" + this._options.port.toString() + ' start:' + this._options.tag + " [" + t.toString() + "]\n";
logAsync(s);
}
}
this._counter++;
this.push(chunk);
done();
};
Identity.prototype._flush = function(done) {
if (this._options.once) {
// it seems it's flushing twice?
if (instances > 1) {
instances--;
} else {
console.log(instances, ids[0]);
console.log(instances, 'instance diff', process.hrtime(this.totalLatency));
console.log(instances, 'max diff', process.hrtime(ids[0]));
}
} else {
var t = process.hrtime();
console.error(this._counter, t, this._options);
// TODO add "generator" for generator stage, so that we parse it later
var s = "\n" + this._options.port.toString() + ' end:' + this._options.tag + " [" + t.toString() + "]\n"
logAsync(s);
}
//this.push(null);
done();
};
//// Impl. 2
//var id = {
// transform: function(chunk, encoding, next) {
// //console.error(chunk.toString().length)
// //this.push(chunk)
// // sets this._transform under the hood
// // generate output as many times as needed
// // this.push(chunk);
// // call when the current chunk is consumed
// next();
// },
// flush: function(done) {
// // sets this._flush under the hood
// // generate output as many times as needed
// //console.error('done')
// this.push('done');
// done();
// }
//}
////var T = stream.Transform;
////var transform = new T(id);
var toEnum = function (arr) {
return arr.reduce(function (obj, e) {
obj[e.toString().toUpperCase()] = e;
return obj;
}, {})
}
var logFile = 'results.txt';
var log = function (data, f) {
f = f || logFile
try {
require('fs').appendFileSync(f, stringify(data));
} catch (e) {
require('fs').appendFile('SOMETHING-WENT-TERRIBLY-WRONG', (e.toString() + e.stack.toString()));
}
};
var logAsync = function (data, f) {
f = f || logFile
require('fs').appendFile(f, stringify(data));
};
// from: github.com/firefoxes/diff-hrtimear
var diffHrtime = function(b, a){
var as = a[0], ans = a[1],
bs = b[0], bns = b[1],
ns = ans - bns,
s = as - bs;
if (ns < 0) {
s -= 1
ns += 1e9
}
return [s, ns]
};
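`diffHrtime` is what the experiment-parsing script at the top uses to subtract two `process.hrtime()` `[seconds, nanoseconds]` tuples. A sketch (function reproduced from utils.js; the sample readings are made up):

```javascript
// Sketch: diffHrtime reproduced from utils.js -- subtracts two
// [seconds, nanoseconds] tuples, borrowing from the seconds field
// when the nanosecond difference goes negative.
var diffHrtime = function (b, a) { // b: earlier reading, a: later one
  var as = a[0], ans = a[1],
      bs = b[0], bns = b[1],
      ns = ans - bns,
      s = as - bs;
  if (ns < 0) {
    s -= 1;
    ns += 1e9;
  }
  return [s, ns];
};

var start = [10, 800000000]; // e.g., a recorded process.hrtime() value
var end = [12, 100000000];
console.log(diffHrtime(start, end)); // [1, 300000000] -- 1.3s elapsed
```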
module.exports.regex = regex;
// FIXME: utils should be able to choose, given a param
module.exports.Color = Color;
module.exports.C256 = C256;
module.exports.coerce = coerce;
module.exports.isNative = isNative;
module.exports.addForward = addForward;
module.exports.multiForward = multiForward;
module.exports.splitIntoForward = splitIntoForward;
module.exports.timestamp = timestamp;
module.exports.generateError = generateError;
module.exports.ensureCallback = ensureCallback;
module.exports.typeOf = typeOf;
module.exports.toHumanReadable = toHumanReadable;
module.exports.uuid4 = uuid4;
module.exports.uuid5 = uuid5;
module.exports.isNode = isNode;
module.exports.toNodeId = toNodeId;
module.exports.toNode = toNode;
module.exports.toArgs = toArgs;
module.exports.toCall = toCall;
module.exports.getIP4 = getIP4;
module.exports.defaultCallback = defaultCallback;
module.exports.defaultOpts = defaultOpts;
module.exports.stringify = stringify;
module.exports.typeAsString = typeAsString;
module.exports.detectCycle = detectCycle;
module.exports.equal = equal;
module.exports.clone = clone;
module.exports.createProperty = createProperty;
module.exports.unix = unix;
module.exports.percentOf = percentOf;
module.exports.padString = padString;
module.exports.escape = escape;
module.exports.unescape = unescape;
module.exports.getStats = getStats;
module.exports.toMicros = toMicros;
module.exports.toMillis = toMillis;
module.exports.storeOn = storeOn;
module.exports.readFrom = readFrom;
module.exports.Identity = Identity;
module.exports.toEnum = toEnum;
module.exports.merge = merge;
module.exports.log = log;
module.exports.logAsync = logAsync;
module.exports.diffHrtime = diffHrtime;