@v0lkan
Created January 19, 2019 06:13
Promise Loop
/*
* \
* \\,
* \\\,^,.,,. JavaScript: from Zero to Hero
* ,;7~((\))`;;,, <zerotoherojs.com>
* ,(@') ;)`))\;;', an extraordinary course to learn JavaScript
* ) . ),(( ))\;, and related technologies
* /;`,,/7),)) )) )\,,
* (& )` (,((,((;( ))\,
*
*/
// Creating a promise chain is not strictly recursive, because each
// `then(fn)`’s `fn` executes in its own stack.
//
// HOWEVER, the resulting promise still has to maintain a chain of
// resolutions, so that once (if) the promise chain eventually resolves,
// it can pass the resolution value all the way up to the first promise.
//
// That’s why, if you don’t limit the depth of your promise chains, you
// may leak memory and even crash the entire Node.js process once there
// is not enough heap space left to allocate from.
//
// Here is a one-liner that exhausts your Node.js app’s heap and crashes it.
const loop = async () => Promise.resolve(true).then(loop);
loop();
// Terminal output will be similar to this:
//
//
// <--- Last few GCs --->
//
// [31436:0x103800c00] 17448 ms: Scavenge 1388.2 (1423.2) -> 1387.8 (1423.7) MB, 1.9 / 0.0 ms (average mu = 0.162, current mu = 0.129) allocation failure
// [31436:0x103800c00] 17454 ms: Scavenge 1388.5 (1423.7) -> 1388.1 (1424.2) MB, 5.2 / 0.0 ms (average mu = 0.162, current mu = 0.129) allocation failure
// [31436:0x103800c00] 17456 ms: Scavenge 1388.8 (1424.2) -> 1388.4 (1425.2) MB, 2.3 / 0.0 ms (average mu = 0.162, current mu = 0.129) allocation failure
//
//
// <--- JS stacktrace --->
//
// ==== JS stack trace =========================================
//
// 0: ExitFrame [pc: 0x3113334fb7d]
// 1: StubFrame [pc: 0x3113333e7f8]
// 2: StubFrame [pc: 0x3113331c6d2]
// 3: EntryFrame [pc: 0x31133305c9e]
// 4: ExitFrame [pc: 0x3113334fb7d]
// Security context: 0x2e871991d969 <JSObject>
// 5: _tickCallback [0x2e87f5993b49] [internal/process/next_tick.js:43] [bytecode=0x2e875eb3c269 offset=49](this=0x2e87827823d9 <process map = 0x2e875fccc941>)
// 6: /* anonymous */ [0x2e87bb559791] [internal/m...
//
// FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
// 1: 0x10003a9d9 node::Abort() [/usr/local/bin/node]
// 2: 0x10003abe4 node::FatalTryCatch::~FatalTryCatch() [/usr/local/bin/node]
// 3: 0x10019ed17 v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
// 4: 0x10019ecb4 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/usr/local/bin/node]
// 5: 0x1005a5882 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/usr/local/bin/node]
// 6: 0x1005a4838 v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/usr/local/bin/node]
// 7: 0x1005a2443 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/usr/local/bin/node]
// 8: 0x1005aecbc v8::internal::Heap::AllocateRawWithLightRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/usr/local/bin/node]
// 9: 0x1005aed3f v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/usr/local/bin/node]
// 10: 0x10057dfc4 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/usr/local/bin/node]
// 11: 0x100832070 v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/usr/local/bin/node]
// 12: 0x3113334fb7d
// 13: 0x3113333e7f8
// 14: 0x3113331c6d2
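To see the chain growth without crashing the process, a bounded variant of the same shape can be used. This is an illustrative sketch; `boundedLoop` and `limit` are not part of the gist:

```javascript
// Bounded version of the crashing one-liner. Each `.then(...)` returns
// a promise that stays linked to its parent until the innermost one
// settles, so the bookkeeping grows with `limit` instead of without bound.
const boundedLoop = (n, limit) =>
  n >= limit
    ? Promise.resolve(n)
    : Promise.resolve(n + 1).then((next) => boundedLoop(next, limit));

boundedLoop(0, 100000).then((n) => console.log(n)); // logs 100000
```

Removing the `limit` check recovers the unbounded loop above, where the chain of pending resolutions is exactly what eats the heap.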
joeyhub commented Oct 19, 2019

I figured in the end you're right, though there are a lot of details here. It wasn't obvious when just jumping to the code and assuming the opening comments were noise; at first it looks like a normal infinite loop.

There is a lot of scope for confusion around what's meant by async. Async is a concept with a rough, approximate definition that only becomes specific in a given implementation. With promises, the implementation of `then`'s asynchrony is a contrivance, and it's a different implementation from how async traditionally works in JS. This has the following results:

  • It can be hard to say under what meaning of async it qualifies.
  • It cannot be assumed to necessarily exhibit the same properties as other implementations.

It's a kind of pseudo-async: you can technically simulate what it does fully with synchronous code, modulo some technicalities about what happens with the stack. However, if the spec states that it truncates the stack, then it should do so and should not be expected to overflow in any manner.

The promise chain doesn't overflow the call stack, though it is being used with the intention of restoring the function stack via async/await, which it doesn't do properly because it's a queue rather than a stack; that's another area of confusion. It approximates asynchronicity and it approximates call stacks, with unspecified behaviour in many cases. Pseudo call stacks and pseudo async.

Your example here shows yet a further peculiarity arising from basing async functions on Promise/A+. The chaining makes it like having a hidden await, which has the effect of creating a function stack in the form of promise chains, i.e. a tree of listeners. With async functions, the idiosyncrasies of Promise/A+ tend to be revealed.

I believe you're ending up with something approaching this:

loop = async () => (async () => await loop())()

Since `then` and `await` are effectively the same.
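A minimal sketch of that claim (`viaThen` and `viaAwait` are illustrative names): an `await` suspension behaves much like a `.then` callback.

```javascript
// Both functions add 1 to the promise's eventual value. `await` suspends
// the async function and resumes it when the promise settles, much like
// registering a `.then` callback on it.
const viaThen = (p) => p.then((v) => v + 1);
const viaAwait = async (p) => (await p) + 1;

Promise.all([viaThen(Promise.resolve(41)), viaAwait(Promise.resolve(41))])
  .then(([a, b]) => console.log(a, b)); // 42 42
```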

I don't think you found an unambiguous memory leak; rather, you found that async is really just shorthand for using promises in an unconventional way, rather than straight-up functions, and that this is prone to lead to inadvertent memory leaks. As I said, in all my code I've been avoiding returning promises like this, because I don't trust what happens when they meld, which is different from returning a plain value. Whenever I have an async function immediately returning a promise, I make it a sync function, and I never nest promises more than needed or expected. I don't think I ever return anything I don't intend to return. All my arrow functions are void if I didn't intend to return something.

Though to be certain, I would try this out with a sync loop function. I have a feeling the same will happen with:

Promise.resolve().then(() => Promise.resolve().then(() => Promise.resolve().then(/* etc. */)));

I think in my own promise implementations and specifications, when I set chain to false (the default in my implementation, as things should never do anything unless asked to), it stops that, though it's been five years or more since I made it. I think dancing between weak refs and strong refs would also cover some cases, but in a convoluted way.

I wouldn't entirely disqualify this from being a leak, because I believe even with the chain there's stuff left referenced only from the chain that could be cleaned up. To be more specific, you have items unreferenced except within the chain that are also unreachable within the chain; that makes them candidates for collection. A more subtle suboptimal case arises where A..Z can be reduced to (A, Z). Truncating is normal, but chopping off the head and tail and then joining them is harder.

I'd also test with this:

const loop = async () => {Promise.resolve(true).then(loop)};

You might find this interesting.
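The difference between the two forms comes down to whether the chained promise is returned. A bounded sketch of both shapes (counters and names here are illustrative, not from the thread):

```javascript
// Form 1: the chained promise is returned, so every call stays linked
// until the innermost promise settles (the shape of the crashing loop).
let a = 0;
const returning = async () =>
  ++a < 1000 ? Promise.resolve().then(returning) : a;

// Form 2: the braced body discards the chained promise, so each call's
// promise resolves on its own and is free to be collected.
let b = 0;
const discarding = async () => {
  if (++b < 1000) Promise.resolve().then(discarding);
};
```

With the counters removed (the unbounded case), the first form grows the resolution chain without bound, while the second can run in constant memory.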


v0lkan commented Oct 21, 2019

Thanks for the link. I’ll check it out.

Nope, I don’t claim I’ve found an unambiguous memory leak.

If anything, it is a more-or-less obvious logical leak.

So it depends on the definition of a leak; but from a practical standpoint it sure is leaking and crashing the system ;)

A non-strict definition of a leak can be: Anything that increases memory utilization without bound, and will likely crash or starve the system if not kept in check.

Cheers,

V.


joeyhub commented Oct 21, 2019

If I get the time later I'll see if I can make a minimal promise implementation that can truncate using weakrefs or see if it would need destructors.

If I can get it to collect that will answer the question less ambiguously.


joeyhub commented Oct 21, 2019

"use strict";
var queue = [];

let l = 0;

// A promise that runs callbacks through a shared queue instead of
// chaining resolutions on the native stack.
function LeakproofPromise() {
    var tooSoon, result;

    if(++l % 1000000 === 0)
        console.log(`L: ${l} ${d} ${i}`);

    // Resolve with at most one value; flush any listeners buffered
    // before resolution into the shared queue.
    this.resolve = (...arg) => {
        if(arg.length > 1 || result)
            throw new Error();

        result = arg;

        if(tooSoon) {
            queue.push(...tooSoon.map(tooSoon => [tooSoon, arg]));
            tooSoon = undefined;
        }
    };

    // Register a callback; returns a child promise for chaining.
    this.then = (callback) => {
        const child = new LeakproofPromise();

        if(result)
            queue.push([[callback, child], result]);
        else {
            if(!tooSoon)
                tooSoon = [];

            tooSoon.push([callback, child]);
        }

        return child;
    };
}

let d = 0;

const f = () => {
    if(++d % 1000000 === 0)
        console.log(`D: ${l} ${d} ${i}`);

    const a = new LeakproofPromise(), b = new LeakproofPromise();
    b.resolve();
    const c = b.then(f);
    a.resolve(c);
    return a;
};

(() => {
    const p = new LeakproofPromise();
    p.resolve();
    p.then(f);
})();

const stack = [];

// Move everything queued so far onto the work list.
const toStack = () => {
    if(!queue.length)
        return;

    //stack.push(...queue.reverse());
    stack.push(...queue);
    queue = [];
};

toStack();

let i = 0;

// Flat drain loop: run callbacks and adopt any returned promise
// without ever growing the native call stack.
while(stack.length) {
    if(++i % 1000000 === 0)
        console.log(`I: ${l} ${d} ${i}`);

    //const [callback, arg] = stack.pop();
    const [[callback, child], arg] = stack.shift(), result = callback(...arg);

    if(result)
        result.then(child.resolve);
    else
        child.resolve(result);

    toStack();
}
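The crucial bit above is the flat drain loop: callbacks go through a queue and run at top level instead of recursing. Stripped of the promise machinery, the idea is roughly this (names are illustrative):

```javascript
// A shared task list stands in for the `queue`/`stack` pair above.
const tasks = [];
const schedule = (fn) => tasks.push(fn);

let n = 0;
const step = () => {
  if (++n < 100000) schedule(step); // reschedule instead of recursing
};
schedule(step);

// Flat drain loop: constant stack depth no matter how many steps run.
while (tasks.length) tasks.shift()();

console.log(n); // 100000
```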

Just a straight-up basic promise implementation is hammering the CPU core that drew the short straw, but with constant memory. I am sure the heads (`new Promise`) can be collected.

Need to confirm this promise implementation doesn't have some other loop bug though or some logic fail.

I wonder about catch though. I'm finding it not so easy surprisingly to get it to leak.

Confirmed as another design flaw in Promise/A+: nodejs/node#6673
