The Event Loop does a simple job - it orchestrates the work between the Call Stack and the Event Queue until the Event Queue is empty.
- The Call Stack is a simple list of functions.
- A Stack is a simple FILO (First In, Last Out) data structure.
- The top element, the one we can pop out, is the last element we pushed onto the stack.
- Since Node is single-threaded, we have only one stack.
- The Event Queue is a list of things (events) to be processed.
- A Queue is a FIFO (First In, First Out) data structure.
- The first event we en-queue will be the first event to be de-queued.
- To de-queue and process an event from the edge of the Queue, we just invoke the function associated with it.
function foo() {
  setTimeout(() => { // this callback is handed to a Node timer API, not put on the Call Stack yet
    console.log('process after 2 sec.'); // when the timer fires after 2 seconds, the callback is placed into the Event Queue.
    // Exactly at this moment the Event Loop has something important to do.
    // The Event Loop's job is very simple - it monitors the Call Stack
    // and the Event Queue. When the Call Stack is empty and the Event Queue is not,
    // it de-queues the event from the Event Queue and places it on the Call Stack.
    // The Call Stack invokes the callback associated with the event and then pops
    // the callback off the Call Stack. The Call Stack and Event Queue are empty now,
    // so the Event Loop doesn't need to process anymore. All Node APIs work with this
    // concept.
  }, 2000);
}
- Event Emitter is a module that facilitates communication between objects in Node.
- Event Emitter is at the core of Node's async event-driven architecture. Many built-in modules inherit from Event Emitter.
- An emitter object has two main features: emitting named events, e.g. logger.emit('error:event'), and registering listener functions, e.g. logger.on('error:event', listenerFn).
Event Emitter vs. Callback?
- By using an Event Emitter we can react to a signal in multiple places of our application by registering several listeners, whereas a Callback is called only by its associated function.
- run node debug index.js, the debugger will listen on 127.0.0.1:5858.
- use help to see the available debugger commands.
- use cont to continue debugging.
- use restart to restart debugging.
- use sb(2 /* line number */) to put a breakpoint where the debugger will stop.
- use repl when a breakpoint is activated to inspect anything accessible to the script at that point. E.g. type an argument name of the function or a variable name and repl will output its value.
- use watch(arg /* variable name to watch */) to watch the value of a variable without breaking every time. It is handy when debugging loops.
- run node --inspect --debug-brk index.js, it will output a URL; copy this URL and paste it into your browser and you have a fully functional debugger with a lot of features.
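A minimal, hypothetical index.js to practice the commands above - the breakpoint inside the loop is where watch() pays off:

```javascript
// index.js - a small script with a loop, handy for trying sb() and watch()
function sum(numbers) {
  let total = 0;
  for (const n of numbers) {
    total += n; // put a breakpoint here with sb(<line>), then watch('total') and cont
  }
  return total;
}

console.log(sum([1, 2, 3])); // prints 6
```

Stepping through the loop with watch('total') set shows the accumulator changing on every iteration without having to inspect it manually each time.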
Working with a big amount of data in Node.js means working with Streams.
Streams in Node.js give you the power of composability in your code; just like you can compose powerful Linux commands by piping smaller commands, e.g. $ git grep require | grep -v // | wc -l
, you can do the same in Node.js with Streams.
Streams are simply collections of data that might not be available all at once and don't have to fit in memory.
There are four fundamental types of Streams in Node.js: Readable, Writable, Duplex and Transform.
A Readable stream is an abstraction for a source from which data can be consumed.
An example of that is the fs.createReadStream function of the fs module.
Readable Stream events:
- data - emitted whenever the stream passes a chunk of data to the consumer,
- end - emitted when there is no more data to be consumed from the stream,
- error, close, readable.
Readable Stream functions:
pipe(), unpipe(), read(), unshift(), resume(), pause(), isPaused(), setEncoding().
Readable Streams can be either in paused mode or in flowing mode. These are sometimes referred to as pull vs. push modes.
A Writable stream is an abstraction for a destination to which data can be written.
An example of that is the fs.createWriteStream function of the fs module.
Writable Stream events:
- drain - a signal that the writable stream can receive more data,
- finish - emitted when all the data has been flushed to the underlying system,
- error, close, pipe, unpipe.
Writable Stream functions:
write(), end(), cork(), uncork(), setDefaultEncoding().
Duplex streams are both Readable and Writable, like a socket for example.
Transform streams are basically Duplex streams that can be used to modify or transform the data as it is written and read.
An example of that is the zlib createGzip stream, which compresses data using gzip.
You can think of a Transform stream as a function where the input is the Writable stream part and the output is the Readable stream part.
All Streams are instances of EventEmitter. They all emit events that we can use to write or read data from them. However, we can consume streams in a simpler way using the pipe method, e.g. src.pipe(dst) /* src - readable stream, dst - writable stream */.
Linux: a | b | c | d
Node: a.pipe(b).pipe(c).pipe(d);
or a.pipe(b); b.pipe(c); c.pipe(d);
Stream implementers are usually the ones who use the stream module.
For consuming, all we have to do is either use pipe or listen to stream events.
const { Writable } = require('stream');
// it will echo whatever you type to the console.
const echoStream = new Writable({
  write(chunk, encoding, callback) { // required option to implement a writable stream.
    console.log(chunk.toString());
    callback();
  }
});
To consume the above stream, we can simply use process.stdin.pipe(echoStream);.
echoStream is not really useful; the same echo functionality can be implemented using process.stdout:
process.stdin.pipe(process.stdout), now it will do the same - echo whatever you type to the console.