
@chetandhembre
Last active August 29, 2015 14:06
Performance of Node.js and how the GC works
// Benchmark: repeatedly allocate an object, drop the reference, and measure
// how long 10,000 invocations of each loop take.
var globalTime = new Date().getTime();
var obj;

// Loop A: 12,001 iterations (i <= 12000) of allocate-then-null.
var a = function () {
  var i;
  for (i = 0; i <= 12000; i++) {
    obj = {};
    obj = null;
  }
};

// Loop C: 12,000 iterations (i < 12000) of allocate-then-null.
var c = function () {
  var i;
  for (i = 0; i < 12000; i++) {
    obj = {};
    obj = null;
  }
};

// Run the given loop 10,000 times and report the elapsed time.
var b = function (callback) {
  var date = new Date().getTime();
  var i;
  for (i = 0; i < 10000; i++) {
    callback();
  }
  console.log("Completed a Function Test in", new Date().getTime() - date);
};
console.log("Defined Everything in", new Date().getTime()-globalTime);
b(a);
b(c);
console.log("Program End at", new Date().getTime()-globalTime);
# measure the time required to run named.js
time node --prof named.js
Defined Everything in 6
Completed a Function Test in 936
Completed a Function Test in 972
Program End at 1917
real 0m1.964s
user 0m2.007s
sys 0m0.033s
# run node-tick-processor; only the relevant output is shown below
Statistical profiling result from v8.log, (1913 ticks, 1435 unaccounted, 0 excluded).
[Unknown]:
ticks total nonlib name
1435 75.0%
[Shared libraries]:
ticks total nonlib name
441 23.1% 0.0% /usr/bin/nodejs
35 1.8% 0.0% /lib/x86_64-linux-gnu/libc-2.19.so
1 0.1% 0.0% 7fff7a5fe000-7fff7a600000
1 0.1% 0.0% /lib/x86_64-linux-gnu/libpthread-2.19.so
[JavaScript]:
ticks total nonlib name
[C++]:
ticks total nonlib name
[GC]:
ticks total nonlib name
167 8.7%
[Bottom up (heavy) profile]:
Note: percentage shows a share of a particular caller in the total
amount of its parent calls.
Callers occupying less than 2.0% are not shown.
ticks parent name
441 23.1% /usr/bin/nodejs
# measure the time required to run namedFunction.js
time node --prof namedFunction.js
Defined Everything in 7
Completed a Function Test in 480
Completed a Function Test in 474
Program End at 963
real 0m1.012s
user 0m1.025s
sys 0m0.025s
# run node-tick-processor; only the relevant output is shown below
Statistical profiling result from v8.log, (981 ticks, 742 unaccounted, 0 excluded).
[Unknown]:
ticks total nonlib name
742 75.6%
[Shared libraries]:
ticks total nonlib name
215 21.9% 0.0% /usr/bin/nodejs
23 2.3% 0.0% /lib/x86_64-linux-gnu/libc-2.19.so
1 0.1% 0.0% /lib/x86_64-linux-gnu/libpthread-2.19.so
[JavaScript]:
ticks total nonlib name
[C++]:
ticks total nonlib name
[GC]:
ticks total nonlib name
69 7.0%
[Bottom up (heavy) profile]:
Note: percentage shows a share of a particular caller in the total
amount of its parent calls.
Callers occupying less than 2.0% are not shown.
ticks parent name
215 21.9% /usr/bin/nodejs
23 2.3% /lib/x86_64-linux-gnu/libc-2.19.so
# run the unoptimized namedFunction.js
time node --prof namedFunction.js
Defined Everything in 7
Completed a Function Test in 47500
Completed a Function Test in 49747
Program End at 97256
real 1m37.304s
user 1m35.975s
sys 0m4.504s
# run node-tick-processor; only the relevant output is shown below
Statistical profiling result from v8.log, (94560 ticks, 7178 unaccounted, 0 excluded).
[Unknown]:
ticks total nonlib name
7178 7.6%
[Shared libraries]:
ticks total nonlib name
84583 89.4% 0.0% /usr/bin/nodejs
2135 2.3% 0.0% /lib/x86_64-linux-gnu/libpthread-2.19.so
657 0.7% 0.0% /lib/x86_64-linux-gnu/libc-2.19.so
5 0.0% 0.0% 7fff7397c000-7fff7397e000
1 0.0% 0.0% /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.19
1 0.0% 0.0% /lib/x86_64-linux-gnu/libm-2.19.so
[JavaScript]:
ticks total nonlib name
[C++]:
ticks total nonlib name
[GC]:
ticks total nonlib name
13382 14.2%
[Bottom up (heavy) profile]:
Note: percentage shows a share of a particular caller in the total
amount of its parent calls.
Callers occupying less than 2.0% are not shown.
ticks parent name
84583 89.4% /usr/bin/nodejs
2135 2.3% /lib/x86_64-linux-gnu/libpthread-2.19.so
@chetandhembre
I think you're going too deep into V8 unnecessarily.

  1. The V8 garbage collector knows when to clean up memory and optimizes for it. Doing your own memory cleaning is not recommended, because V8 manages its objects in memory quite differently in order to make access fast. Check this out.
  2. If your concern is that deleting objects yourself should make things better, then read this: delete obj never cleans up memory, it only removes the reference. The memory is still allocated and will be freed by the GC on its next run. Even assigning null won't help here, so my advice is to let the GC do its job (see the sketch after this list).
  3. For the callback (pthread) stuff: closures may give you great power, but they can cause great memory consumption (also covered in the sketch below). Check here.
  4. I have not seen your code, so I cannot comment on the app-freezing problem. But there are tools that can help you find the bottleneck. In my experience, about 99% of the time we (developers) don't follow the rules or write very unoptimized code. Many companies like Walmart, PayPal, eBay and LinkedIn are using Node.js in production, and they handle pretty big traffic. So I suggest you diagnose the problem.
  5. There will be no performance difference between using null and delete, for the reason I mention in point 2. Check these jsPerf test results.
  6. And yes, it's not about your breakup letter for Node.js; it is your decision, go ahead with it. But just know exactly what is happening there. Be aware.
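
A minimal sketch of points 2 and 3 (the variable and function names here are hypothetical, not from the gist): neither delete nor assigning null frees memory on the spot; both only change reachability, and the GC reclaims the object on a later cycle, while a closure keeps whatever it references alive.

// Point 2: delete vs. null -- neither frees memory synchronously.
var cache = { big: new Array(1e6).join('.') };  // roughly a 1 MB string
delete cache.big;      // option A: removes the property (the reference); the GC frees the string later
cache.big = null;      // option B: drops the reference the same way; the GC frees it on a later cycle

// delete only works on object properties; on a plain variable it is a no-op
// (and a SyntaxError in strict mode), so assigning null is the only option there.
var tmp = {};
tmp = null;            // drop the reference and let the GC do its job

// Point 3: a closure keeps everything it references alive.
function makeHandler() {
  var big = new Array(1e6).join('.');           // retained as long as the closure is reachable
  return function () { return big.length; };
}
var handler = makeHandler();                    // big cannot be collected until handler is released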

@DronRathore

Thanks for the link to Egorov's blog, that was my cup of tea for today (I was looking for links that would make me dive deeper). Well, obj = null does have something to do with V8: on assigning null, V8 actually calls v8::internals::HeapScavenge and deliberately clears a section. delete obj, on the other hand, marks an entry for the V8 GC to do its work later when it gets the call, and it decrements the object counter, which assigning null does not do (at that particular point in time; later the GC has to figure out the object references on its own). So by using delete we pay the GC cost at the very next process tick, which is quite doable, as the section we just released will be cleared momentarily. With null, that cost has to be paid later, together with the resolution of the object reference counter and then the actual cleaning (if it hasn't happened yet), which is blocking.

Node.js is known for its non-blocking I/O operations, and we do try to use async, non-blocking calls to gain performance, but the GC itself is a blocking call (about which we can do nothing), so it needs to be taken care of.
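
A minimal sketch of how that blocking pause can be observed directly (assuming Node is started with the --expose-gc flag; the file name and allocation sizes are hypothetical). Running the same script with --trace-gc instead prints one line per collection, including its pause time.

// Run as: node --expose-gc gc-pause.js   (global.gc only exists behind --expose-gc)
var garbage = [];
for (var i = 0; i < 1e6; i++) {
  garbage.push({ n: i });          // build up a heap full of short-lived objects
}
garbage = null;                    // drop every reference so the next GC can reclaim them

var start = process.hrtime();
global.gc();                       // force a full, blocking collection right now
var diff = process.hrtime(start);
console.log('GC pause:', ((diff[0] * 1e9 + diff[1]) / 1e6).toFixed(2), 'ms');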

It's not that we haven't tried to figure things out while developing our Node.js server, but from time to time we have to resolve glitches, a few caused unknowingly, and that takes another round of analysing the heap and call trees. For a developer, spending time analysing a problem caused by the internal architecture is fun only if he is sitting on the bench, not for someone who has to move at pace. It is doable to keep our Node.js server live and working: I can always run clusters under Phusion Passenger and that will make things work for me; I just have to be vigilant and keep an eye on the Mem% figures at least weekly, which is doable since, being an agile firm, we deploy regularly every day, so that works for me. But I have to code sockets and build services that need to be robust, talk to each other, and handle hundreds of thousands of client requests at a time. Coding sockets on Node.js pushed me to either use streams or some helper library, because most of the time packets just go out of bounds (OOB) for it, so I did end up dealing with most of those things myself. But even today, if someone walks up to me wanting a quick-and-dirty service or addon coded in the office, I code it in Node.js.

Of all the examples you gave of industry players using Node.js, let me be clear: they haven't moved everything onto it. They gave only a part of their problem domain to Node.js, because it's risky to hand your whole service over to Node and rely on it (it is still unreliable until we see a stable v1.0 of Node released). For example, PayPal implemented only its dashboard in Node.js, and Walmart did so for its APIs.

Hope that makes sense. :)
It's great to have tech talks with you; I don't often find people taking things seriously. Kudos!
