A quick intro to the main concepts of Node.js

Event Loop.

The Event Loop does a simple job - it orchestrates the work between the Call Stack and the Event Queue until the Event Queue is empty.

Call Stack

  • It is a simple list of functions.
  • A Stack is a simple LIFO (Last In, First Out) data structure.
  • The top element that we can pop off is the last element that we pushed onto the stack.
  • Since Node is single-threaded, we have only one Call Stack (see the sketch below).
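
A quick way to see the Call Stack is to print a stack trace from a chain of nested calls. This is a minimal sketch; the function names a, b and c are made up for illustration:

function c() { console.trace(); } // prints the current Call Stack: at c, at b, at a, ...
function b() { c(); }
function a() { b(); }
a();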

Event Queue / Message Queue / Callback Queue

  • It's a list of things (events) to be processed.
  • A Queue is a FIFO (First In, First Out) data structure.
  • The first event we en-queue will be the first event to be de-queued.
  • To de-queue and process an event from the front of the Queue, we just invoke the function associated with it.
function foo() {
   setTimeout(() => { // This callback is handed off to Node's timer API, not run immediately.
      console.log('process after 2 sec.'); // When the timer fires after 2 seconds, the callback is placed into the Queue.
                                           // Exactly at this moment the Event Loop has something important to do.
                                           // The Event Loop's job is very simple - it monitors the Call Stack
                                           // and the Event Queue. When the Call Stack is empty and the Event Queue is not,
                                           // it de-queues the event from the Event Queue and places its callback on the Call Stack.
                                           // The Call Stack invokes the callback and pops it off once it returns.
                                           // The Call Stack and the Event Queue are both empty now, so the Event Loop
                                           // has nothing more to process. All Node APIs work with this concept.
   }, 2000);
}

Node's Event Driven Architecture

Callbacks, Promises, Async/Await.

Event Emitter

  • Event Emitter is a module that facilitates communication between objects in Node.
  • Event Emitter is at the core of Node's async event-driven architecture. Many built-in modules inherit from Event Emitter.
  • An emitter object has two main features: emitting named events, e.g. logger.emit('error:event'), and registering listener functions, e.g. logger.on('error:event', listenerFn).

Event Emitter vs. Callback?

  • By using Event Emitter we can react to a signal in multiple places of our application by registering several listeners, whereas a Callback would be called only by its associated function (see the sketch below).
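
A minimal sketch of that difference, assuming a hypothetical logger emitter - two independent parts of the application react to the same named event:

const EventEmitter = require('events');

const logger = new EventEmitter();

// Two listeners registered in different places of the app; both react to the same event.
logger.on('error:event', (err) => console.error('log to console:', err));
logger.on('error:event', (err) => { /* e.g. send the error to a monitoring service */ });

logger.emit('error:event', new Error('something failed')); // both listeners are invoked, in order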

Debugging

Built-in debugger in CLI

  1. run node debug index.js, the debugger will listen on 127.0.0.1:5858.
  2. use help to see the available commands for the debugger.
  3. use cont to continue debugging.
  4. use restart to restart debugging.
  5. use sb(2 /*line number*/) to put a breakpoint where the debugger will stop.
  6. use repl when a breakpoint is activated to inspect anything accessible to the script at that point. Ex. type the name of a function argument or a variable and repl will output its value.
  7. use watch(arg /*variable name to watch*/) to watch the value of a variable without breaking every time. It is handy when debugging loops, like the one in the sketch below.
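
A minimal script to try those commands on - the file name index.js and the function sum are made up for illustration:

// index.js (hypothetical) - run `node debug index.js`, then try sb(), repl and watch() on it.
function sum(a, b) {
   const result = a + b; // set a breakpoint on this line with sb(), then inspect a, b and result in repl
   return result;
}

for (let i = 0; i < 3; i++) {
   console.log(sum(i, i + 1)); // watch('i') is handy here instead of breaking on every iteration
}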

Built-in debugger in Browser

Run node --inspect --debug-brk index.js; it will output a URL. Copy this URL and paste it into your browser, and you have a fully functional debugger with a lot of features.

Streams

Working with big amounts of data in Node.js means working with Streams.

Streams in Node.js give you the power of composability in your code. Just like you can compose powerful Linux commands by piping smaller commands together, ex: $ git grep require | grep -v "//" | wc -l, you can do the same in Node.js with Streams.

Streams are simply collections of data that might not be available all at once and don't have to fit in memory.

There are four fundamental types of Streams in Node.js: Readable, Writable, Duplex and Transform.

Readable Stream

A Readable stream is an abstraction for a source from which data can be consumed. An example of that is the fs.createReadStream function of the fs module.

Readable Stream events:

  • data - emitted whenever the stream passes a chunk of data to the consumer,
  • end - emitted when there is no more data to be consumed from the stream,
  • error,
  • close,
  • readable.

Readable Stream functions:

  • pipe(), unpipe(),
  • read(), unshift(), resume(),
  • pause(), isPaused()
  • setEncoding().

Readable Streams can be either in paused mode or in flowing mode. These are sometimes referred to as pull vs. push modes.
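
A minimal sketch of consuming a Readable stream in flowing (push) mode - the file path ./big.file is made up for illustration:

const fs = require('fs');

const readable = fs.createReadStream('./big.file'); // hypothetical source file

// Attaching a 'data' listener switches the stream into flowing mode.
readable.on('data', (chunk) => console.log(`received ${chunk.length} bytes`));
readable.on('end', () => console.log('no more data'));
readable.on('error', (err) => console.error(err));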

Writable Stream

A Writable stream is an abstraction for a destination to which data can be written. An example of that is the fs.createWriteStream function of the fs module (see the sketch below).

Writable Stream events:

  • drain - a signal that the writable stream can receive more data,
  • finish - emitted when all the data has been flushed to the underlying system,
  • error,
  • close,
  • pipe,
  • unpipe.

Writable Stream functions:

  • write(),
  • end(),
  • cork(),
  • uncork(),
  • setDefaultEncoding().
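
A minimal sketch of writing to a Writable stream - the destination path ./out.txt is made up for illustration:

const fs = require('fs');

const writable = fs.createWriteStream('./out.txt'); // hypothetical destination file

writable.write('hello ');
writable.end('world\n'); // end() accepts a final chunk and closes the stream
writable.on('finish', () => console.log('all data has been flushed'));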

Duplex Stream

Duplex streams are both Readable and Writable, like a socket for example.
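
A minimal sketch of a Duplex stream, assuming a TCP connection to example.com - the socket can be written to and read from at the same time:

const net = require('net');

// A TCP socket is a Duplex stream: readable and writable at once.
const socket = net.connect(80, 'example.com', () => {
   socket.write('HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n'); // writable side
});
socket.on('data', (chunk) => console.log(chunk.toString())); // readable side
socket.on('end', () => console.log('server closed the connection'));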

Transform Stream

Transform streams are basically Duplex streams that can be used to modify or transform the data as it is written and read.

An example of that is the zlib createGzip stream to compress the data using gzip.

You can think of a Transform stream as a function where the input is the Writable stream part and the output is the Readable stream part (see the sketch below).
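
A minimal sketch of implementing a Transform stream - this hypothetical upperCase stream upper-cases whatever flows through it:

const { Transform } = require('stream');

const upperCase = new Transform({
   transform(chunk, encoding, callback) {
      this.push(chunk.toString().toUpperCase()); // whatever we push becomes the readable side
      callback(); // signal that this chunk has been processed
   }
});

process.stdin.pipe(upperCase).pipe(process.stdout);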

All Streams are instances of EventEmitter. They all emit events that we can use to write or read data from them. However, we can consume streams in a simpler way using the pipe method, ex: src.pipe(dst) /* src - readable stream, dst - writable stream */.

Piping

Linux: a | b | c | d

Node: a.pipe(b).pipe(c).pipe(d); or a.pipe(b); b.pipe(c); c.pipe(d);
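
For example, the zlib gzip Transform mentioned above can be composed with fs streams in exactly this way (the file paths are made up for illustration):

const fs = require('fs');
const zlib = require('zlib');

fs.createReadStream('./big.file')                 // a - readable source
  .pipe(zlib.createGzip())                        // b - transform: gzips the data as it flows through
  .pipe(fs.createWriteStream('./big.file.gz'));   // c - writable destination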

Implementing Streams

Stream implementers are those who create new types of streams, usually by using the stream module.

Consuming Streams

For consuming, all we have to do is either use pipe or listen to stream events.

Implementing Writable Stream

const { Writable } = require('stream');

// it will echo whatever you type to the console.
const echoStream = new Writable({
   write(chunk, encoding, callback) { // write() is the required option to implement a writable stream.
      console.log(chunk.toString());
      callback(); // signal that the chunk has been processed.
   }
});

To consume the above stream, we can simply use process.stdin.pipe(echoStream);

echoStream is not really useful; the same echo functionality can be implemented by using process.stdout:

process.stdin.pipe(process.stdout), which will now echo whatever you type to the console in the same way.
