@getify /gist:5253319
Last active Apr 26, 2016

exploring what a `currentThis` might look like in JS
// This is a theoretical exploration of something that could
// be added to JS. It borrows from JS events which have both a
// `target` and `currentTarget` element, which distinguish between
// the direct target of an event and the delegated parent handling
// the event, respectively.
// The idea is to distinguish between `this` and `currentThis`,
// respectively, as...
// `this`: the initial containing object for a call that initiates a
// traversal of the [[Prototype]] chain (just as it currently does)
// `currentThis`: the actual link in the [[Prototype]] chain where the
// traversal resolution is completed.
// Notice below that with `bar1` for example, `this` will point to
// `bar1` even when using the delegated `identify()` function found
// on `Foo`, but `currentThis` will point to `Foo`, since that's
// where `identify()` was found in the chain.
// So... what if:
var Foo = Object.create(null);
Foo.name = "Foo";
Foo.identify = function() {
	// whenever Foo#identify() is called, `currentThis` will
	// always be `Foo`, but `this` will continue to follow the
	// established rules for determining a `this` binding.
	console.log("Me: " + this.name + "; Current: " + currentThis.name);
};

var Bar = Object.create(Foo);
Bar.name = "Bar";
Bar.another = function() {
	console.log("Current: " + currentThis.name);
};

var bar1 = Object.create(Bar);
bar1.name = "bar1";
var bar2 = Object.create(Bar);
bar2.name = "bar2";

Foo.identify();  // "Me: Foo; Current: Foo"
Bar.identify();  // "Me: Bar; Current: Foo"
Bar.another();   // "Current: Bar"
bar1.identify(); // "Me: bar1; Current: Foo"
bar2.identify(); // "Me: bar2; Current: Foo"
bar1.another();  // "Current: Bar"
bar2.another();  // "Current: Bar"

getify commented Mar 27, 2013

Some other semantic/behavioral notes:

  1. I do not intend to suggest that currentThis would be lexically set. It would be set based on wherever a function is found in a [[Prototype]] chain lookup. In the above example, if another object Baz borrowed a "copy" of (that is, a reference to) the Foo.identify() function, and then a call was made that ended up resolving to Baz#identify(), currentThis would be Baz, not Foo.
  2. If a function references a currentThis and that function is called in such a way that no [[Prototype]] chain resolution happens (it's called directly, or whatever), then currentThis is just the same value as whatever this would be (according to established this rules).
  3. Since currentThis would be sort of "statically" bound (based on the [[Prototype]] chain only), that means it creates a mechanism for "statically" walking up that chain one level at a time, via __proto__ (or Object.getPrototypeOf()). Unlike this, which can be dynamically bound, and can easily create an infinite circular recursion, currentThis.__proto__ would always point one link up the [[Prototype]] chain, and thus that would serve the common semantic of a super call. For example:
var Baz = Object.create(Foo);
Baz.name = "Baz";
Baz.identify = function() {
   console.log("I'm " + this.name + "; Current: " + currentThis.name);

   // using `this` here could create an infinite recursion,
   // but `currentThis` is always "safe" in that respect
   currentThis.__proto__.identify.call(this); // calls Foo#identify(), preserving `this`
};

var baz1 = Object.create(Baz);
baz1.name = "baz1";
baz1.identify = function() {
   console.log("Yeah, it's " + this.name + "; Current: " + currentThis.name);
   currentThis.__proto__.identify.call(this); // calls Baz#identify(), preserving `this`
};

Foo.identify();
// "Me: Foo; Current: Foo"

Baz.identify();
// "I'm Baz; Current: Baz"
// "Me: Baz; Current: Foo"

baz1.identify();
// "Yeah, it's baz1; Current: baz1"
// "I'm baz1; Current: Baz"
// "Me: baz1; Current: Foo"

getify commented Mar 27, 2013

A sort-of-polyfill PoC for currentThis (obviously inefficient):
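The PoC code itself isn't reproduced above, but the shape of the idea might look something like the following. This is a minimal sketch, not the gist's actual code; `findCurrentThis` is an invented helper name, and since real JS has no `currentThis` binding, the resolution has to be done explicitly inside the method:

```javascript
// Hypothetical sketch (not the gist's actual PoC): resolve the
// `currentThis` value by walking the [[Prototype]] chain from `this`
// until we find the object that owns the method being called.
function findCurrentThis(self, propName) {
    var link = self;
    while (link != null) {
        if (Object.prototype.hasOwnProperty.call(link, propName)) {
            return link; // the first link that owns the property
        }
        link = Object.getPrototypeOf(link);
    }
    // no [[Prototype]] resolution happened; fall back to `this`
    // (matching semantic note 2 above)
    return self;
}

var Foo = Object.create(null);
Foo.name = "Foo";
Foo.identify = function() {
    var currentThis = findCurrentThis(this, "identify");
    return "Me: " + this.name + "; Current: " + currentThis.name;
};

var Bar = Object.create(Foo);
Bar.name = "Bar";
var bar1 = Object.create(Bar);
bar1.name = "bar1";

console.log(Foo.identify());  // "Me: Foo; Current: Foo"
console.log(bar1.identify()); // "Me: bar1; Current: Foo"
```

The chain walk repeats work the engine already did to find `identify` in the first place, which is why the PoC is described as obviously inefficient.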

allenwb commented Mar 27, 2013

What you are suggesting here is very similar to one alternative for defining the semantics of "super" for JavaScript. Essentially, the "currentThis" value is the object whose [[Prototype]] is used as the starting point for looking up "foo" in an expression like:

What currentThis provides could also be described as "dynamic super binding". We considered this alternative for ES6 "super" semantics. However, it has the problem that it imposes significant additional overhead on every function call, whether or not the called function actually references "super". The call-site doesn't know anything about the callee, so it has to assume that the callee contains a "super" lookup. We would have to go from having one implicit argument (this) for every function call to having two implicit arguments (this and currentThis). This was just too much of a performance penalty.

Instead we went with "static super binding", where each function that references "super" has a static reference to the object that holds that function as an own property value. This still provides the normally expected super lookup semantics and eliminates the every-call performance penalty. It is also the technique used by most other languages that support "super" method invocations. But it comes at the price of added complexity when performing reflection operations upon such functions. When a super-bound method is installed as an own method in another object, a new function object instance with its own super binding must be generated.
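For illustration, this static binding is what ES6 eventually shipped: `super` in a concise method is statically tied to the object literal that defines the method (its home object), so copying the function elsewhere does not rebind it. The object names below are invented for the sketch:

```javascript
// ES6 "static super binding": `super` inside a concise method is
// tied to the object literal that defines the method.
const Parent = {
    identify() { return "Parent"; }
};
const Child = {
    __proto__: Parent,
    identify() { return super.identify() + " via Child"; }
};
console.log(Child.identify()); // "Parent via Child"

// The reflection caveat Allen mentions: copying the method does NOT
// rebind its `super` -- the lookup still starts above Child, because
// the binding was fixed when the method was defined.
const Other = { identify: Child.identify };
console.log(Other.identify()); // still "Parent via Child"
```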

have a look at this:

which solves the problem in an ES6-like way through this.super(arg1, arg2, argN); and makes var later = this.super; later(1, 2, 3); possible thanks to the self-bound method ... it works through the caller, as Allen said, and in a less efficient way than ES6


getify commented Mar 28, 2013

@allenwb - thanks for the insights. I have a few questions and clarifications...

  1. You said:

    Essentially, the "currentThis" value is the object whose [[Prototype]] is used as the starting point for looking up "foo" in an expression like:

    I am a little confused by your statement here, but it's probably just me reading it wrong. What you say here seems to me like the opposite (of sorts) of what currentThis is intended to be. Basically, I was saying currentThis would be the ending point of the [[Prototype]] chain lookup, which theoretically starts at whatever this points to (the original object instance, etc), and walks its way up the chain, until it finds foo, and whatever object it first finds foo on would be currentThis.

    Maybe that's exactly what you meant, but I read "starting point" as to be headed in the opposite direction. :)

    Moreover, you kind of compare super to currentThis, and I think that's also slightly off from my intention, as in my case super === currentThis.__proto__. Again, that may be exactly what you meant, just clarifying from my confusion.

  2. Is the performance hit for currentThis that passing a second separate item into the execution context (both this and currentThis) would be costly? Or are you saying that tracking where you are in the [[Prototype]] chain traversal (so you have a value for currentThis) is the performance hit?

    In my totally inadequate understanding of implementation details, it seems like you already have to walk up the [[Prototype]] chain (I'm imagining a linked-list sort of thing) looking for foo at each level, and then when you find foo, you call it. In that way of things, you already know what currentThis is, because it's whichever level of the chain of objects that you just finished traversing and found foo on.

    I'd be a tad surprised if merely adding another parameter (where value is already known/obvious) into the context was the performance hit, as compared to the tracking during traversal, but just trying to understand exactly where this perf hit would occur.

    If tracking where you are in the chain, and/or passing in where you ended up in the chain as a named argument, is the perf hit, maybe something like this could have helped?

    Here's a very naive inefficient sort-of-polyfill for currentThis, implemented as a function (maybe a getter?) instead of a special variable, which does the traversal at access time (rather than ahead of time) by starting at this and walking up the chain to find the foo.

    Was just wondering if some sort of thing like that (basically lazy-resolution) could have resolved some of those super perf concerns by shifting the perf hit to the access time of currentThis (or super) as a getter function of sorts, instead of pre-calculating it? That way not all function calls would be slowed down, only those using the new functionality.
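That lazy-resolution idea could be sketched, hypothetically, as an opt-in wrapper, so that only functions using the new facility pay the lookup cost. The wrapper name `withCurrentThis` and the explicit-first-argument convention are inventions of this sketch, since real JS has no `currentThis` binding to populate:

```javascript
// Hypothetical wrapper: resolve `currentThis` lazily, at call time,
// only for methods that opt in; ordinary calls pay nothing extra.
function withCurrentThis(propName, fn) {
    return function() {
        // walk up from `this` to the link that owns the property
        var link = this;
        while (link != null &&
               !Object.prototype.hasOwnProperty.call(link, propName)) {
            link = Object.getPrototypeOf(link);
        }
        // pass the resolved link as an explicit first argument,
        // standing in for a true `currentThis` binding
        var args = [link || this].concat([].slice.call(arguments));
        return fn.apply(this, args);
    };
}

var Foo = { name: "Foo" };
Foo.identify = withCurrentThis("identify", function(currentThis) {
    return "Me: " + this.name + "; Current: " + currentThis.name;
});

var bar1 = Object.create(Object.create(Foo));
bar1.name = "bar1";
console.log(bar1.identify()); // "Me: bar1; Current: Foo"
```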

Thanks again for the expert (and patient!) input. :)

allenwb commented Mar 28, 2013

I was describing pretty much the same thing, but from the perspective of super. Your currentThis is indeed the end point of a lookup. However, if the method that was found does a super property access, the lookup for that access needs to start at the [[Prototype]] of currentThis. Otherwise, a super call with the same property name as the original method would loop. In the ES6 semantics we capture currentThis, rather than currentThis.__proto__, to account for the possibility that the [[Prototype]] of the object initially identified as currentThis could be dynamically modified (using __proto__) prior to the actual super lookup.
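This choice is observable in ES6 `super` today: the engine captures the home object (the analogue of currentThis) and reads its [[Prototype]] at the moment of the `super` access, so later `__proto__` tampering is reflected. A small sketch with invented names:

```javascript
const A = { m() { return "A"; } };
const B = { m() { return "B"; } };
const C = {
    __proto__: A,
    // the super lookup starts at C's *current* [[Prototype]],
    // read at the time the `super.m` access is evaluated
    m() { return super.m(); }
};
console.log(C.m()); // "A"
Object.setPrototypeOf(C, B);
console.log(C.m()); // "B" -- the super lookup sees the mutated chain
```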

How the cost is accrued might vary depending upon implementation techniques. Regardless, either an additional implicit parameter has to be passed on every function call or the caller has to somehow dynamically examine the callee and make a determination that it doesn't need to pass currentThis. It's probably actually cheaper to just always pass it. If a typical function has approximately one argument plus this, adding currentThis is a 50% penalty added to the parameter passing overhead of a typical function call.

There really isn't any additional overhead for tracking the lookup, as you already have to keep the initial this, and you need to know which element of the [[Prototype]] chain you are currently searching; that element is the currentThis. It's really the extra parameter passing that's the overhead.

Parameter passing overhead is a significant fraction of the function call cost in most languages, so implementers like to avoid adding to the fixed overhead. The actual lookup result is probably going to be cached, so its cost can be kept low.

Your lazy relookup to determine currentThis only when it's actually referenced breaks if properties are added/removed or the [[Prototype]] chain is dynamically modified after the call but before the reference.


getify commented Mar 28, 2013

@allenwb Thanks again for the wonderful detailed explanation. Sheds a lot of light on things.

One last point of curiosity: you've mentioned a couple of times that the decision of whether a function needs a super or currentThis passed to it is a dynamic one. I'm trying to figure out why that would be the case.

It seems from my naivety that a static analysis of the function at compile time (basically, the function has to already have been analyzed before execution, right?) could flag it as "needingCurrent" or not. If that were possible once, at compile time, it seems like it would be "cheaper" than having to dynamically inspect (or just always pass currentThis). And if that were the case, would perf be roughly similar to the static lexical super binding you mentioned was decided for ES6?

What circumstances or code constructs (besides eval/with) could cause the engine to have to dynamically determine the need for currentThis?

In any case, I'll just trust your expertise that it was looked at and couldn't be made to work. Bummer though. I think a currentThis would be helpful in several cases.


getify commented Mar 28, 2013

@allenwb one other thought:

Your lazy relookup to determine currentThis only when it's actually referenced breaks if properties are added/removed or the [[Prototype]] chain is dynamically modified after the call but before the reference.

If I am understanding you correctly, you're suggesting that currentThis would have been statically "fixed" at compile time, and that my lazy-lookup at runtime could give wrong answers?

Actually, I think that's specifically what I intended for currentThis, which is that the author-time decisions are irrelevant, and the actual run-time lookup on the [[Prototype]] chain is what will determine where currentThis ends up pointing to.

Indeed, if you were to dynamically re-arrange the [[Prototype]] chain, I would want currentThis to reflect the "situation on the ground" rather than from a compile-time assumption.

Similarly, imagine if you were to use the "mixin" pattern on the example above, and were to put a reference to the identify() function (as it currently exists only on Foo) directly onto the Bar object, and then do the exact same bar1.identify() call. NOW the function would be found at an earlier step in the lookup chain than before, at Bar rather than at Foo, and so I'd want for that invocation of identify() to bind currentThis to Bar, not Foo.

In other words, currentThis is intended to be a call-time binding, not a compile-time/lexical binding.

In that sense, then, isn't the lazy-lookup I showed (perf concerns aside) the correct semantic for currentThis?

EDIT: I just realized what you probably meant. You're probably talking about if you change properties or re-wire the [[Prototype]] chain from inside the function itself, right? Yeah, that's tricky, tricky. Not sure whether I'd want the currentThis to be modifiable at that point or not. I can see arguments for both. :)

EDIT #2: I updated my little sorta-polyfill to match a better semantic, where the lookup is done once, as the first line of the function, so shouldn't be susceptible to weirdness about changes in properties or [[Prototype]] chain.

allenwb commented Mar 28, 2013


Whether a particular function uses super (or currentThis) can be statically determined when the function is parsed, and that fact can be recorded somewhere. Most likely in some sort of descriptor that is accessible relative to the address of the function.

However, at a call-site we generally don't know what particular function is being called, because the function is obtained via a property lookup or a parameter value, etc. So if the call site wants to only pass the `currentThis` value to functions that actually use it, it needs to inspect the target function's descriptor to see if it uses `currentThis`, and then conditionally pass the `currentThis` value based upon what was recorded in the descriptor.

There are a lot of degrees of variability at this "close to the metal" level of design, but a reasonable simplification of the implementation alternatives are:

  1. Always unconditionally pass the currentThis value. This has an idealized cost of 1 memory store on every call.
  2. Only pass currentThis if the target function is flagged as needing it. This has an idealized cost of 1 memory read + 1 test + 1 conditional branch on every call. If the branch is taken because the target function actually needs currentThis then there will be a store and a branch back to the fast path.

Stores are expensive, but so are reads and branches, so at this level of abstraction we might as well say that the two alternatives add approximately the same base overhead to every function call and that overhead is approximately that of one additional argument.

allenwb commented Mar 28, 2013

Yes, I was referring to __proto__ tampering that might take place after the call but before the reference to super. An arbitrarily large amount of code might execute between those two points, including code that does loading, code-generation, dynamic mix-ins, etc.

It's not always obvious what the best semantics are in situations like this. However, in general JS property accesses are defined to reflect the state of objects at the point when the access is made. This allows for the possibility that object structure is dynamically highly mutable. It is probably best to minimize exceptions to that principle.
