@artze
artze / canvas-sketch_init.sh
Created September 4, 2021 06:29
canvas-sketch project init script
#!/bin/sh
# Init prettier config
cat <<EOF > ./.prettierrc.js
module.exports = {
  arrowParens: "always",
  printWidth: 100,
  semi: true,
  singleQuote: false,
  tabWidth: 2,
};
EOF
@artze
artze / ffmpegcmd.txt
Created September 4, 2021 06:27
Frequently used ffmpeg cmd
// Uses the baseline profile of h264 encoding, and ensures that dimensions are divisible by 2
ffmpeg -r 30 -f image2 -i "%04d.png" -c:v libx264 -profile:v baseline -pix_fmt yuv420p -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -an -crf 17 output.mp4
// With 2x speed. Note: -vf and -filter:v are aliases, so the setpts speed-up is chained into the same filtergraph; passing them as two separate options would make the last one override the scale filter.
ffmpeg -r 30 -f image2 -i "%04d.png" -c:v libx264 -profile:v baseline -pix_fmt yuv420p -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2,setpts=0.5*PTS" -an -crf 17 ./mp4/output-2.mp4
// Classic var-in-loop closure pitfall: `var` is function-scoped, so all three
// callbacks share the same `i`, which is already 3 by the time the timeouts fire.
for (var i = 0; i < 3; i++) {
  setTimeout(function () {
    console.log(i);
  }, 200);
}
// output: 3
// output: 3
// output: 3
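Declaring the loop counter with let instead gives each iteration its own binding, so every callback closes over a different value:

for (let i = 0; i < 3; i++) {
  setTimeout(function () {
    console.log(i);
  }, 200);
}
// output: 0
// output: 1
// output: 2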
@artze
artze / sequentialExecStreamsConcatFiles.js
Created June 12, 2019 17:30
Sequential execution with streams: a basic example that concatenates the contents of multiple files.
const fromArray = require('from2-array');
const through = require('through2');
const fs = require('fs');
const concatFiles = (destination, files, callback) => {
  const destStream = fs.createWriteStream(destination);
  fromArray.obj(files) // [1]
    .pipe(through.obj((file, encoding, done) => { // [2]
      const src = fs.createReadStream(file); // [3]
      src.pipe(destStream, { end: false }); // [4]
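      // NOTE: the preview stops above; what follows is a hedged sketch of how
      // such a pipeline is typically finished, not necessarily the gist's code.
      src.on('end', done); // only signal through2 once this file is fully piped
    }))
    .on('finish', () => { // the writable side of the through stream has ended
      destStream.end();
      callback();
    });
};

// Example usage (file names are placeholders):
// concatFiles('all.txt', ['a.txt', 'b.txt'], () => console.log('Files concatenated'));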
@artze
artze / transfromStream.js
Created June 11, 2019 15:15
A Transform stream example with a string-replace feature
const stream = require('stream');
class ReplaceStream extends stream.Transform {
  constructor(searchString, replaceString) {
    super();
    this.searchString = searchString;
    this.replaceString = replaceString;
    this.tailPiece = ''; // holds back a possible partial match between chunks
  }
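  // NOTE: the preview ends at the constructor. Below is a hedged sketch of the
  // usual tail-piece technique, not necessarily the gist's implementation.
  _transform(chunk, encoding, callback) {
    const pieces = (this.tailPiece + chunk.toString()).split(this.searchString);
    // Hold back the last searchString.length - 1 characters: they may be the
    // start of a match that continues in the next chunk.
    const lastPiece = pieces.pop();
    const keep = this.searchString.length - 1;
    this.tailPiece = keep > 0 ? lastPiece.slice(-keep) : '';
    const emitted = lastPiece.slice(0, lastPiece.length - this.tailPiece.length);
    const out = [...pieces, emitted].join(this.replaceString);
    if (out.length > 0) {
      this.push(out);
    }
    callback();
  }

  _flush(callback) {
    if (this.tailPiece.length > 0) {
      this.push(this.tailPiece); // emit whatever was still held back
    }
    callback();
  }
}

// Example usage:
const replaceStream = new ReplaceStream('World', 'Node.js');
replaceStream.on('data', (chunk) => process.stdout.write(chunk));
replaceStream.write('Hello W');
replaceStream.write('orld!');
replaceStream.end(); // prints: Hello Node.js!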
@artze
artze / createWritableStream.js
Created June 9, 2019 08:52
Create a custom writable stream that creates text files
const stream = require('stream');
const fs = require('fs');
const path = require('path');
const mkdirp = require('mkdirp');
class ToFileStream extends stream.Writable {
  constructor() {
    super({ objectMode: true }); // [1]
  }
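  // NOTE: the preview ends here. A hedged sketch of _write follows, assuming
  // each written object has the shape { path, content } and that the
  // callback-style mkdirp API (mkdirp <= 0.5) is in use.
  _write(chunk, encoding, callback) {
    mkdirp(path.dirname(chunk.path), (err) => { // create the target directory if needed
      if (err) {
        return callback(err);
      }
      fs.writeFile(chunk.path, chunk.content, callback);
    });
  }
}

// Example usage: every object written becomes a text file on disk.
const tfs = new ToFileStream();
tfs.write({ path: 'files/file1.txt', content: 'Hello' });
tfs.write({ path: 'files/file2.txt', content: 'Node.js streams' });
tfs.end(() => console.log('All files created'));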
@artze
artze / writableBackpressure.js
Created June 9, 2019 08:13
Basic writable stream backpressure example
const Chance = require('chance');
const http = require('http');
const chance = new Chance();
http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  const generateMore = () => {
    while (chance.bool({ likelihood: 95 })) {
      let shouldContinue = res.write(
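        // NOTE: the preview cuts off inside res.write(). The rest is a hedged
        // sketch; the chunk size, port and log messages are my own choices.
        chance.string({ length: 16 * 1024 - 1 }) // a large chunk so the buffer fills up quickly
      );
      if (!shouldContinue) {
        // The internal buffer is full: stop writing and resume once it drains.
        console.log('Backpressure');
        return res.once('drain', generateMore);
      }
    }
    res.end('\nThe end...\n', () => console.log('All data sent'));
  };
  generateMore();
}).listen(8080, () => console.log('Listening on http://localhost:8080'));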
@artze
artze / writableStreamHttpServer.js
Created June 9, 2019 06:34
Basic example of writable stream in http server
const Chance = require('chance');
const http = require('http');
const chance = new Chance();
http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  while (chance.bool({ likelihood: 95 })) {
    res.write(chance.string() + '\n');
  }
  res.end('\n The end... \n');
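  // NOTE: the preview ends before the server is started; the port below is an
  // assumption, not taken from the gist.
}).listen(8080, () => console.log('Listening on http://localhost:8080'));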
@artze
artze / creatingReadableStream.js
Created June 8, 2019 08:44
Create a Readable Stream implementation that generates random strings
const stream = require('stream');
const Chance = require('chance');
const chance = new Chance();
class RandomStream extends stream.Readable {
  constructor(options) {
    super(options);
  }

  _read(size) {
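    // NOTE: the preview ends at the _read() signature. The body below is a
    // hedged sketch matching the description; the 5% stop chance is my choice.
    const chunk = chance.string(); // generate a random string
    this.push(chunk, 'utf8');
    if (chance.bool({ likelihood: 5 })) {
      this.push(null); // occasionally end the stream
    }
  }
}

// Example usage:
const randomStream = new RandomStream();
randomStream
  .on('data', (chunk) => console.log(`Chunk received: ${chunk.toString()}`))
  .on('end', () => console.log('End of stream'));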
@artze
artze / flowingReadableStream.js
Created June 8, 2019 08:02
Flowing readable stream basic example
process.stdin
  .on('data', (chunk) => {
    console.log(`Chunk read: (${chunk.length}) "${chunk.toString()}"`);
  })
  .on('end', () => process.stdout.write('End of stream'));
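For contrast, the same program in non-flowing (paused) mode, pulling chunks explicitly with read() inside the 'readable' event handler (this variant is a sketch for comparison, not part of the gist):

process.stdin
  .on('readable', () => {
    let chunk;
    // Pull data out of the internal buffer until it is drained.
    while ((chunk = process.stdin.read()) !== null) {
      console.log(`Chunk read: (${chunk.length}) "${chunk.toString()}"`);
    }
  })
  .on('end', () => process.stdout.write('End of stream'));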